[comp.lang.misc] First Languages

eugene@pioneer.arpa (Eugene N. Miya) (01/23/88)

In article <1616@codas.att.com> karthur@codas.att.com (Kurt_R_Arthur) writes:
>I'm running the risk of another holy war here, but REXX (IBM's system command
>interpreter language for VM/CMS, now a part of SAA & due to be ported
>throughout IBM's architectures) seems to be a very good 'first language' because it

I'm not partial to C as a first language, and I know all the BASIC and
APL interpreter types are also out there.  Interaction and real-time is
a really important attribute.  I'm becoming more and more convinced that
lexically scoped, state-based systems aren't good first languages.  (I
think Mike Lesk made some good comments about this at the last
Washington Usenix.) Some of these new computer graphics animation
systems are seemingly good FIRST-time systems, and offer some real hope
of getting BASIC (and LOGO) "out of there."

From the Rock of Ages Home for Retired Hackers:

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
  "You trust the `reply' command with all those different mailers out there?"
  "Send mail, avoid follow-ups.  If enough, I'll summarize."
  {uunet,hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene

kers@otter.HP.COM (Christopher Dollin) (01/25/88)

Well here we are on a "let's plug our favourite language" trip ...

As far as teaching programming languages go, Sussex University use Pop11 as
the language taught to students at their Cognitive Studies Department. Many
of these students are not programmers, and Pop11 is intended to be accessible
to them.

Pop11 is a descendant of Pop2, which at least one other poster has mentioned.
A recipe for constructing Pop11 might run ...

    Take the syntactic approach of Pascal [or Algol68], the data-type
    philosophy of Lisp, the open stack approach of Forth. Simmer well, testing
    frequently on novices. Serve. Update as necessary.

Me? I LOVE it!


Regards,
Kers                                    | "Why Lisp if you can talk Poperly?"

csrdi@its63b.ed.ac.uk (Janet: csrdi@edinburgh.its63b) (01/29/88)

Well, at Edinburgh we've been using Pascal for the last few years, but there
is now a movement afoot to use ML as the first teaching language. I know that
this year's first year will be getting introduced to ML quite shortly.

My own experience of learning ML (in second year) had quite an effect on my
programming style. Learning Prolog also affected it - I think in both cases for
the better. Functional languages certainly give a greater appreciation of the
computational process, I think, than procedural languages do. If you want an
understanding of what's happening at that level, something like ML is a very
good first language.
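To give a flavour of what such a first program looks like, here is a small sketch in OCaml, a later ML dialect (the function names are illustrative choices of mine, not taken from the Edinburgh course):

```ocaml
(* ML-style programming: structural recursion and pattern matching
   take the place of loops and mutable state. *)
let rec sum = function
  | [] -> 0
  | x :: rest -> x + sum rest

(* The same computation via a fold, the reusable abstraction a
   functional course would introduce next. *)
let sum' xs = List.fold_left ( + ) 0 xs

let () =
  assert (sum [1; 2; 3; 4] = 10);
  assert (sum' [1; 2; 3; 4] = 10)
```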

	--Rick.

-- 
Janet: csrdi@edinburgh.its63b
BITNET: csrdi%uk.ac.ed.its63b@UKACRL
ARPA: csrdi@its63b.ed.ac.uk
UUCP: not recommended - we pay real money for it!

"Life would be so much easier if everyone read the manual."

pcm@iwarpo3.intel.com (Phil C. Miller) (02/02/88)

In article <932@its63b.ed.ac.uk> RDI@uk.ac.ed.ecsvax (Rick Innis, CS4) writes:

>Well, at Edinburgh we've been using Pascal for the last few years, but there
>is now a movement afoot to use ML as the first teaching language. I know that
>this year's first year will be getting introduced to ML quite shortly.

This is a really interesting movement you're discussing.  I have used
ML in the classroom, have even implemented a (subset) compiler.  So
far, the only context in which I have seen ML is the university
environment.

This is one of those situations in which a well-intentioned effort
should probably be directed a bit more toward the mainstream.  Perhaps
a language like C++ or Ada (which are languages in widespread use in
industry) would be better than a language which is, in effect, an
experimental language.  I have just changed jobs.  In no interview was
I asked whether I knew ML; in every interview I was asked if I knew C.

>My own experience of learning ML (in second year) had quite an effect on my
>programming style. Learning Prolog also affected it - I think in both cases for
>the better. Functional languages certainly give a greater appreciation of the
>computational process, I think, than procedural languages do. If you want an
>understanding of what's happening at that level, something like ML is a very
>good first language.

Part of the reasoning behind a technical education should be to prepare
a student for the working experience.  I contend that exposing
first-year students to a functional programming language does not fit
that role.

Phil Miller
Opinions expressed are mine, not necessarily my employer's, etc.

shirono@grasp.cis.upenn.edu (Roberto Shironoshita) (02/03/88)

In article <2781@omepd> pcm@iwarpo3.UUCP (Phil C. Miller) writes:
> In article <932@its63b.ed.ac.uk> RDI@uk.ac.ed.ecsvax (Rick Innis, CS4) writes:
> >[re: a movement in Edinburgh to use ML as a first language]
>
> [ML only seen in university environment]
>
> [Some text deleted]
> Perhaps a language like C++ or Ada (which are languages in
> widespread use in industry) would be better than a language which
> is, in effect, an experimental language.
> [More text deleted]
>
> >[Functional languages are better than procedural]
>
> Part of the reasoning behind a technical education should be to prepare
> a student for the working experience.  I contend that exposing
> first-year students to a functional programming language does not fit
> that role.

The crucial word here is _technical_.  The question that arises is
whether a particular Computer Science program is preparing students
to go into industry, or whether it is preparing them to go to graduate
school and then to research.  Some programs stress the often-called
"real world," whereas others stress the theory behind CS.  I believe a
well-rounded BSE program should prepare students for both.

Thus, both procedural and functional languages should be introduced in
the first year.  This doesn't mean that students should know all the
ins and outs of either kind of language by their sophomore year; quite
on the contrary, they shouldn't be burdened so.  They should, however,
be able to cope with both kinds.


                                   Roberto Shironoshita

-----------------------------------------------------------------------
Disclaimer 1:  The opinions expressed here are my own.  The University
	       need not share them, or even be aware of them.
Disclaimer 2:  Like most humans, I'm bound to err at times.  I believe
	       what I have said, but agree that I may be wrong.

  @@@@@@@@@\         Full Name:    Roberto Shironoshita
   @@     @@         Occupation:   BSE candidate in Computer Science
   @@     @@         Organization: University of Pennsylvania
   @@@@@@@@/
   @@                Network Address:
   @@                    PENNnet:  shirono@eniac.seas
  @@@@                   Internet: shirono@eniac.seas.upenn.edu

pds@quintus.UUCP (Peter Schachte) (02/04/88)

In article <2781@omepd>, pcm@iwarpo3.intel.com (Phil C. Miller) writes:
> Part of the reasoning behind a technical education should be to prepare
> a student for the working experience.  I contend that exposing
> first-year students to a functional programming language does not fit
> that role.

No, perhaps not (although the first language I learned in college was
Lisp, which was also the language I programmed in in my first job after
college).  Many, probably most, CS graduates will wind up programming
in C, or Ada, or FORTRAN, or assembler.  But this does not mean that
those languages have to be taught in first year college courses.  There
is plenty of time in a four year CS program to teach all the practical
tools.  It's more important to teach students concepts of algorithms and
data structures and correctness and documentation and structured design
than to teach individual programming languages.

Starting a first year student off with a language like C or FORTRAN or
Ada or assembler is like teaching an infant to swim by throwing him into
the ocean.  It may work, but the mortality rate seems a bit high.  Start
him off in calmer waters until he's strong enough to handle the ocean.
And while he's in calm waters, you can concentrate on teaching him
technique.
-- 
-Peter Schachte
pds@quintus.uucp
...!sun!quintus!pds

robison@uiucdcsb.cs.uiuc.edu (02/05/88)

I've heard that MIT teaches Scheme and CLU.  Since these are not mainstream 
languages, obviously no one hires MIT graduates.

- Arch

debray@arizona.edu (Saumya Debray) (02/05/88)

In article <2781@omepd>, pcm@iwarpo3.intel.com (Phil C. Miller) writes:
> Part of the reasoning behind a technical education should be to prepare
> a student for the working experience.  I contend that exposing
> first-year students to a functional programming language does not fit
> that role.

Assuming you're not referring to two-year trade schools that crank out
programmers, I disagree.  In my opinion, a primary purpose of a CS
degree program is to teach students the basic principles of computation.
I feel that at the early stages, this is best done using a declarative
language.  Starting out with high-level assemblers like C or Basic can all
too easily damage students' brains to a point where they have a hard time
grasping any concept not directly available in these languages.

If your student progresses beyond the first year, he'll presumably
encounter languages like C; with early exposure to declarative languages,
he should have the background to use these in a disciplined way.  If he
doesn't progress beyond the first year, of course, the point is moot.

> [ ... ]  I have just changed jobs.  In no interview was I asked
> whether I knew ML; in every interview I was asked if I knew C.

A lot of interviewers are idiots who'll also ask you whether you've worked
on machine PQR and operating system XYZ.  Do you feel that, in "preparing a
student for the work experience", schools should dispense with computer
architecture courses, in favor of training on a dozen different machines?
-- 
Saumya Debray		CS Department, University of Arizona, Tucson

     internet:   debray@arizona.edu
     uucp:       {allegra, cmcl2, ihnp4} !arizona!debray

esh@otter.hple.hp.com (Sean Hayes) (02/05/88)

In article <2781@omepd>, pcm@iwarpo3.intel.com (Phil C. Miller) writes:
> Part of the reasoning behind a technical education should be to prepare
> a student for the working experience.  I contend that exposing
> first-year students to a functional programming language does not fit
> that role.

It should also be the goal of good educational establishments to promote
technologies that stand a good chance of being used in the future.
To that end I was taught C and UNIX when nobody in Industry wanted it, and
in fact I would contend that the reason Industry wants it now is because it
was taught to undergraduates who now make up the bulk of Industry.

(1/2 :-)
It is entirely reasonable to expose first-years to declarative languages,
especially when most of them have been brain damaged by hacking in BASIC and
Assembler on their whizz-bang micros.
-----------
|Sean Hayes,          Hewlett Packard Laboratories,      Bristol,  England|
|net: esh@hplb.uucp   esh%shayes@hplabs.HP.COM       ..!mcvax!ukc!hplb!esh|

vic@zen.UUCP (Victor Gavin) (02/08/88)

I personally feel that the teaching of a non-procedural language in the first
year of a computing science course is an excellent idea.

One of the main problems that I have noticed in classes is all the snotty
nosed kids who have ``learned'' to program at home on their BASIC computers,
*all by themselves*.

Unfortunately their methods and practices usually stink.

This means that before you can teach them Good Habits, you have to get them
to unlearn their existing Bad Habits.

By forcing them to use a language which doesn't allow them access to their old
habits you make it easier to show them what you are talking about. Then when
you start to show them how to program in procedural languages they may be a bit
easier to handle.

		vic
--
Victor Gavin						Zengrange Limited
vic@zen.co.uk						Greenfield Road
..!mcvax!ukc!zen.co.uk!vic				Leeds LS9 8DB
+44 532 489048						England

pcm@iwarpo3.intel.com (Phil C. Miller) (02/09/88)

In article <3730@megaron.arizona.edu> debray@arizona.edu (Saumya Debray) writes:
>In article <2781@omepd>, pcm@iwarpo3.intel.com (Phil C. Miller) writes:
>> Part of the reasoning behind a technical education should be to prepare
>> a student for the working experience.  I contend that exposing
>> first-year students to a functional programming language does not fit
>> that role.
>
>Assuming you're not referring to two-year trade schools that crank out
>programmers, I disagree.  In my opinion, a primary purpose of a CS
>degree program is to teach students the basic principles of computation.

I cannot agree that languages like Prolog and ML teach students the
basic principles of computation.  Prolog is quite foreign to the
underlying structure of existing (von Neumann) architectures.  ML, as a
functional programming language, enforces a programming style which is
(1) inefficient; (2) not in widespread use; (3) not compatible with
most algorithms which appear in the literature.  Seems to me that any
of these are an adequate counter-argument to use of ML as an
introductory language.  Prolog, while a fine language for AI/DBMS
applications, is simply not a general-purpose language.


>I feel that at the early stages, this is best done using a declarative
>language.  Starting out with high-level assemblers like C or Basic can all
>too easily damage students' brains to a point where they have a hard time
>grasping any concept not directly available in these languages.

The generalization inferred by this is that any programmer not exposed on
day one to declarative languages is brain-damaged.  This covers 99.99999%
of programmers.  Too bad, if only Donald Knuth and Per Brinch Hansen had seen
ML before they were ruined.

>If your student progresses beyond the first year, he'll presumably
>encounter languages like C; with early exposure to declarative languages,
>he should have the background to use these in a disciplined way.  If he
>doesn't progress beyond the first year, of course, the point is moot.

This brings us full circle to the original point: what should be the first 
language a student is taught.  Most universities use Pascal or C, which at
least offer the twin advantages of (1) widespread use and (2) some
comprehensible relationship to the underlying architecture.

>> [ ... ]  I have just changed jobs.  In no interview was I asked
>> whether I knew ML; in every interview I was asked if I knew C.
>
>A lot of interviewers are idiots who'll also ask you whether you've worked
>on machine PQR and operating system XYZ.  Do you feel that, in "preparing a
>student for the work experience", schools should dispense with computer
>architecture courses, in favor of training on a dozen different machines?

I obviously said nothing of the kind, directly or indirectly.

Questioning an applicant regarding his/her level of expertise in a
particular programming language gives an interviewer some assessment of
the applicant's potential for contribution TO AN EXISTING SOFTWARE
METHODOLOGY.  In other words, given that a company needs someone to
work on Unix in C for a project which may already be behind schedule,
they merely wish to know how long it will be before the applicant can
contribute.

I'll close my comments (and hopefully terminate the religious tirade of
ML fans) by stating that declarative languages have a place in computer
science education.  I happen to feel that the place is graduate school
and not undergraduate.

One minor point I should make here: I have formally studied both ML and 
Prolog in graduate school, and I think both are fine languages.  I just
don't think a first-year student should see them as their first language, just
as I don't think Latin is a good language to teach your children when
they are first learning to talk: you teach them the language most commonly
in use in their culture.


>-- 
>Saumya Debray		CS Department, University of Arizona, Tucson
>
>     internet:   debray@arizona.edu
>     uucp:       {allegra, cmcl2, ihnp4} !arizona!debray

Phil Miller
Only I am responsible for my opinions.

markv@uoregon.UUCP (Mark VandeWettering) (02/09/88)

In article <2809@omepd> pcm@iwarpo3.UUCP (Phil C. Miller) writes:
>In article <3730@megaron.arizona.edu> debray@arizona.edu (Saumya Debray) writes:
>>In article <2781@omepd>, pcm@iwarpo3.intel.com (Phil C. Miller) writes:
>>> Part of the reasoning behind a technical education should be to prepare
>>> a student for the working experience.  I contend that exposing
>>> first-year students to a functional programming language does not fit
>>> that role.

	If this is so, then I would expect to receive an education (if
	it can be called that) in C, Pascal, Fortran and Assembler.
	While each of these languages may have its own usage, they are
	really not all that difficult.  The important thing is
	PROGRAMMING WELL, not a particular language.

>>Assuming you're not referring to two-year trade schools that crank out
>>programmers, I disagree.  In my opinion, a primary purpose of a CS
>>degree program is to teach students the basic principles of computation.

	And to show them that there are many ways to view computation,
	as a functional machine with simple substitution semantics,
	logic programming, traditional von Neumann machines, object
	oriented programming.  I have applauded the Abelson and Sussman
	because they admirably achieve this goal in Structure and
	Interpretation of Computer Programs, which was recently adopted
	as the text for our undergraduate program here at the U of
	Oregon.
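	The "simple substitution semantics" view mentioned above can be
	made concrete with a tiny example (mine, in OCaml rather than the
	Scheme of Abelson and Sussman):

```ocaml
(* The substitution model: evaluate an expression by replacing
   parameters with arguments and reducing.  Tracing square (succ 2)
   by hand:

     square (succ 2)
   = square 3            since succ 2 reduces to 2 + 1 = 3
   = 3 * 3
   = 9
*)
let succ n = n + 1
let square n = n * n

let () = assert (square (succ 2) = 9)
```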

>I cannot agree that languages like Prolog and ML teach students the
>basic principles of computation.  Prolog is quite foreign to the
>underlying structure of existing (von Neumann) architectures.  ML, as a
>functional programming language, enforces a programming style which is
>(1) inefficient; (2) not in widespread use; (3) not compatible with
>most algorithms which appear in the literature.  Seems to me that any
>of these are an adequate counter-argument to use of ML as an
>introductory language.  Prolog, while a fine language for AI/DBMS
>applications, is simply not a general-purpose language.

	You make several points here which are questionable.  The
	efficiency argument has two sides: a language may encourage
	inefficient solutions, or the implementation of a language may
	be itself inefficient.  The first is true of any language; it is
	possible to write inefficient programs in almost any language.
	The second is being dealt with by more effective and efficient
	compilers.  Part of the success is due to the fact that
	languages like Scheme and ML (and Miranda and SASL, thanks David
	Turner) have a clean semantics that allows you to reason formally
	about how compilation should proceed.

	"Not in widespread use" is a questionable argument at best.
	Fortran is in wide use, but I would shoot anyone who taught it.
	C, well, I find it a necessary evil, but anticipate C++ as
	taking its place shortly (like when g++ is stable).  Languages
	like C and Pascal and Fortran are all really not all that
	different, and not that hard to pick up once you have a good
	grasp of basic programming skills.

	Your last argument is perhaps the most persuasive: there doesn't
	seem to be a good body of literature on how to solve traditional
	problems in a functional style.  This doesn't mean that someone
	shouldn't do so, however, and I anticipate greater work to be
	done in this area.  In the mean time, a language such as Scheme
	allows you to at least get a taste of a language with pure
	semantics, and still allows programming in a traditional
	procedural sense.

>>I feel that at the early stages, this is best done using a declarative
>>language.  Starting out with high-level assemblers like C or Basic can all
>>too easily damage students' brains to a point where they have a hard time
>>grasping any concept not directly available in these languages.
	
	Emphasize computation, and how it may be accomplished, not how a
	particular LANGUAGE or IMPLEMENTATION may be accomplished.

>The generalization inferred by this is that any programmer not exposed on
>day one to declarative languages is brain-damaged.  This covers 99.99999%
>of programmers.  Too bad, if only Donald Knuth and Per Brinch Hansen had seen
>ML before they were ruined.

	Donald Knuth is an amazing man, and perhaps in 100 lifetimes I
	might accomplish 1% of what he has done, but he is hardly a
	sparkling example of a programmer.  If you need a chuckle, read
	the intro to volume 1 of The Art of Computer Programming, in
	which he explains his MIX machine, and how all programs will
	have to be written in assembler.  These texts, while excellent
	in many ways, are considerably less useful because they are too
	specifically tied to a particular form of programming.

>>If your student progresses beyond the first year, he'll presumably
>>encounter languages like C; with early exposure to declarative languages,
>>he should have the background to use these in a disciplined way.  If he
>>doesn't progress beyond the first year, of course, the point is moot.

	Agreed.  Don't cater to people who have a passing interest in CIS,
	teach something of the science that computer science is rapidly
	developing.  By teaching "useful" skills, you breed a generation
	of programmers that believe that they understand computation,
	but whose fundamental knowledge of computers is constrained by
	the language that they use to express it.

>This brings us full circle to the original point: what should be the first 
>language a student is taught.  Most universities use Pascal or C, which at
>least offer the twin advantages of (1) widespread use and (2) some
>comprehensible relationship to the underlying architecture.

	I know of very few people who have gone through a low-level CIS
	program who could tell me with any real conviction how the
	underlying architecture supported C or Pascal execution.  If
	this is true, then why bother teaching them?  If they have no
	idea of how a computer works, why teach them of a register
	machine?  (Yeah, I know, that's how they work, but they
	probably won't understand that until much later)

>>> [ ... ]  I have just changed jobs.  In no interview was I asked
>>> whether I knew ML; in every interview I was asked if I knew C.

	Well, if ya asked me how I earned my bread and butter, I
	wouldn't say ML.  But we are talking about ideals here (or at
	least I am).

>>A lot of interviewers are idiots who'll also ask you whether you've worked
>>on machine PQR and operating system XYZ.  Do you feel that, in "preparing a
>>student for the work experience", schools should dispense with computer
>>architecture courses, in favor of training on a dozen different machines?

	JCL?  Fortran?  "Just say no!" :-)

>I obviously said nothing of the kind, directly or indirectly.
>
>Questioning an applicant regarding his/her level of expertise in a
>particular programming language gives an interviewer some assessment of
>the applicant's potential for contribution TO AN EXISTING SOFTWARE
>METHODOLOGY.  In other words, given that a company needs someone to
>work on Unix in C for a project which may already be behind schedule,
>they merely wish to know how long it will be before the applicant can
>contribute.
	
	If they add someone to a project that is behind, it will get
	further behind.  No one can be dropped into a vacuum and expected
	to churn code from day one.

>I'll close my comments (and hopefully terminate the religious tirade of
>ML fans) by stating that declarative languages have a place in computer
>science education.  I happen to feel that the place is graduate school
>and not undergraduate.

	It depends.   I would say they have not much of a place in an education
	which is deemed to be practical (although I reserve the right to
	change my mind, I am beginning to believe some more of the
	declarative doctrine).  You are not going to get much call for
	ML programming, which probably means your time could be better
	spent.

>One minor point I should make here: I have formally studied both ML and 
>Prolog in graduate school, and I think both are fine languages.  I just
>don't think a first-year student should see them as their first language, just
>like I don't think Latin is a good language to teach your children when
>they are first learning to talk: you teach them the language most commonly
>in use in their culture.
	
	For interesting reading on "first language?" questions, try
	SIGPLAN Notices, vol. 22, no. 3, March 1987

		"A Critique of Abelson and Sussman or why Calculating is
		better than Scheming" by Philip Wadler
	
	and of course Abelson and Sussman.
>>-- 
>>Saumya Debray		CS Department, University of Arizona, Tucson
>Phil Miller
mark vandewettering, university of oregon cis department

anw@nott-cs.UUCP (02/10/88)

In article <1154@zen.UUCP> vic@zen.UUCP (Victor Gavin) writes:
>One of the main problems that I have noticed in classes is all the snotty
								    ^^^^^^
>nosed kids who have ``learned'' to program at home on their BASIC computers,
 ^^^^^
>*all by themselves*.

	We obviously have a better class of student than you do.  Why knock
    people who are motivated enough to do something "all by themselves"?

>Unfortunately their methods and practices usually stink.

	We obviously have a better class of student than you do.

>This means that before you can teach them Good Habits, you have to unlearn
>them of their existing Bad Habits.
>
>By forcing them to use a language which doesn't allow them access to their old
>habits you make it easier to show them what you are talking about. [...]

	This is pedagogically disastrous.  "Forcing" them to do *anything*
    will ensure that they treat you as the opposition.  They will be saying
    "In Basic, we could do so-and-so, why can't we do it in your supposedly
    superior language?".  You have to demonstrate that your way is superior,
    not ram down their throats what pathetic, snivelling creatures they are
    for knowing only Basic.

	The problem is that, in the early stages, your way almost certainly
    isn't superior.  All of the standard elementary computing problems --
    solve a quadratic, partition an array, draw a graph, count the number
    of words in this sentence, etc. -- go just as well and as easily in
    Basic, structured or not, as in [name your favourite language], and
    they know Basic better than [nyfl].  Further, their Basic on a home
    computer very likely runs rings around [nyfl] on [nyf computer] when
    it comes to graphics, sound, games, etc.  Why should they be interested
    in [nyfl]?  The answer comes much later, when they move up from toy
    programs to serious programs, and when software maintenance becomes a
    serious problem to them.  But you can't -- and shouldn't -- rush the
    process.  Just do things your way, let them do things their way,
    help them constructively when they get into trouble, and let osmosis
    work in its own good time.
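For what it's worth, the word-counting exercise mentioned above really is only a few lines in almost anything; a sketch in OCaml (my choice of language, not the poster's):

```ocaml
(* Count the words in a sentence -- one of the standard elementary
   exercises that Basic and [nyfl] handle about equally well. *)
let count_words s =
  String.split_on_char ' ' s
  |> List.filter (fun w -> w <> "")
  |> List.length

let () =
  assert (count_words "count the number of words in this sentence" = 8)
```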

csrdi@its63b.ed.ac.uk (Janet: rick@uk.ac.ed) (02/10/88)

In article <2781@omepd> pcm@iwarpo3.UUCP (Phil C. Miller) writes:
>> RDI@uk.ac.ed.ecsvax (ME!) writes:
>> [my comments about shifting to ML as a first language]
>
>This is a really interesting movement you're discussing..[deleted]..So
>far, the only context in which I have seen ML is the university
>environment.
>
It seems the Ministry of Defence are now asking for program specs to be
written in ML.  Pity the programs then have to be written in Ada. (spit)

It seems the movement is now towards Modula-2 as a first teaching language,
but ML will continue to be taught in first year, in the second half of the
course. 
>
>> [My comments about learning ML and Prolog affecting programming style] 
>
>Part of the reasoning behind a technical education should be to prepare
>a student for the working experience.  I contend that exposing
>first-year students to a functional programming language does not fit
>that role.

A friend of mine commented in reply to my earlier message that it's another
application of the Sapir-Whorf hypothesis - a language that doesn't affect the
way you think about programming isn't worth knowing.  My contention is that if
you can't learn from something different you are sadly lacking somewhere.

As for preparing students for the working experience, wasn't that discussed
here last year too? If you know *how* to learn then you can do almost
anything within your own limits. Sadly, too many universities (this one
included) take the view that they are here to turn out people with vocational
qualifications rather than an ability to perceive and learn. Putting it
another way, universities are places for getting a degree, not an education.
(I'll leave this track before I start getting political, as I know who I blame
for this state of affairs.)

	--Rick.
-- 
Janet: rick@uk.ac.ed
BITNET: rick%uk.ac.ed@UKACRL
ARPA: rick@ed.ac.uk
UUCP: rick%uk.ac.ed%ukc@mcvax

"Life would be so much easier if everyone read the manual."

robison@uiucdcsb.cs.uiuc.edu (02/11/88)

Re: Knuth and MIX

The use of the arcane MIX code in The Art of Computer Programming reflects
the ancient times in which it was written.  I've heard that Knuth is rewriting
The Art of Computer Programming with Pascal in place of MIX.  He has also 
published a book of TeX's source code with commentary - in this he is certainly
a sparkling example of a programmer.


Arch D. Robison
University of Illinois at Urbana-Champaign
	
CSNET: robison@UIUC.CSNET
UUCP: {ihnp4,pur-ee,convex}!uiucdcs!robison
ARPA: robison@B.CS.UIUC.EDU (robison@UIUC.ARPA)

mmh@ivax.doc.ic.ac.uk (Matthew Huntbach) (02/12/88)

In article <975@its63b.ed.ac.uk> csrdi@itspna (R.Innis) writes:
>anything within your own limits. Sadly, too many universities (this one
>included) take the view that they are here to turn out people with vocational
>qualifications rather than an ability to perceive and learn. Putting it
>another way, universities are places for getting a degree, not an education.
>(I'll leave this track before I start getting political, as I know who I blame

Surely the best thing is that prospective students should have a choice of
courses, some offering a route to programming via conventional languages,
others via experimental/sounder languages. It should be made clear to the
students what is on offer and the arguments both ways. The level of demand
would dictate the balance between the two.

wcs@ho95e.ATT.COM (Bill.Stewart) (02/14/88)

In article <3730@megaron.arizona.edu> debray@arizona.edu (Saumya Debray) writes:
>	> I contend that exposing first-year students to a functional
>	> programming language does not fit that role.
>
>Assuming you're not referring to two-year trade schools that crank out
>programmers, I disagree.  In my opinion, a primary purpose of a CS
>degree program is to teach students the basic principles of computation.
	Remember that 1st-year students, even CS students, are studying
	more than just CS100, and, if possible, they should have *some*
	usable programming knowledge as soon as possible.  While it's
	probably a Bad Thing to expose them to BASIC, whatever
	functional language you teach them had better be adequate for
	doing chemistry and physics homework, numerical integration for
	calculus, statistics for their psych classes, and the like.
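	A trapezoid-rule integrator of the kind such homework needs is
	short enough in a functional style; a sketch in OCaml (the
	function and parameter names are mine):

```ocaml
(* Trapezoid-rule numerical integration: the sort of calculus
   homework a first language should be able to handle. *)
let integrate f a b n =
  let h = (b -. a) /. float_of_int n in
  let rec go i acc =
    if i >= n then acc
    else
      let x = a +. h *. float_of_int i in
      go (i + 1) (acc +. h *. (f x +. f (x +. h)) /. 2.0)
  in
  go 0 0.0

(* The integral of x^2 over [0,1] is 1/3. *)
let () =
  let approx = integrate (fun x -> x *. x) 0.0 1.0 1000 in
  assert (abs_float (approx -. 1. /. 3.) < 1e-4)
```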

	If you have to do this by providing standard library functions
	in your Random-Lisp distribution, fine.  But if you don't do
	it, they'll be off writing scurvy hacks in RPN or BASIC or
	Lotus Macro Language, learning all the bad habits you're
	trying to shield them from.  And somewhere along the line,
	engineering students will *have* to learn Fortran, if only so
	they can do interesting *engineering* research without having
	to rewrite EISPACK and its hench-programs.

>	> [ ... ]  I have just changed jobs.  In no interview was I asked
>	> whether I knew ML; in every interview I was asked if I knew C.
> [ good negative comment about interviewers ]
	The college I went to didn't offer COBOL, either, though it's
	probably useful in a "how the other half lives" survey course.
	I've been meaning to learn Zetalisp to upgrade my resume.
	(it starts with "Ada" or "Algol", depending on the interviewer.
	Any real interviewer will want to know what I *really* can do,
	but to interview at larger companies you may need the buzzword list
	there so the personnel people will pass it on to the people you
	really care about.)
-- 
#				Thanks;
# Bill Stewart, AT&T Bell Labs 2G218, Holmdel NJ 1-201-949-0705 ihnp4!ho95c!wcs

wcs@ho95e.ATT.COM (Bill.Stewart) (02/14/88)

In article <2809@omepd> pcm@iwarpo3.UUCP (Phil C. Miller) writes:
:The generalization inferred by this is that any programmer not exposed on
:day one to declarative languages is brain-damaged.  This covers 99.99999%
:of programmers.  Too bad, if only Donald Knuth and Per Brinch Hansen had seen
:ML before they were ruined.
	I'd pay a lot for a version of Knuth's books with the algorithms
	translated into a reasonable language instead of MIX.  I don't
	mean lisp, since his points about knowing what the machine's
	really doing are correct, but Algol had been around for more
	than 10 years, and it was *designed* for writing algorithms
	readably.  Even his "English-language" descriptions are mostly
	spaghetti-code.
-- 
#				Thanks;
# Bill Stewart, AT&T Bell Labs 2G218, Holmdel NJ 1-201-949-0705 ihnp4!ho95c!wcs

shebs%defun.utah.edu.uucp@utah-cs.UUCP (Stanley T. Shebs) (02/15/88)

In article <170500012@uiucdcsb> robison@uiucdcsb.cs.uiuc.edu writes:

>He [Knuth] has also 
>published a book of TeX's source code with commentary - in this he is certainly
>a sparkling example of a programmer.

The TeX source code is a sparkling example of documentation, but the code
proper is nothing to write home about.  Control flow is scrambled, there
are zillions of strange little temporary variables, and side effects run
rampant.  If you don't believe it, try picking a feature from the TeXbook
and finding out who is responsible for its behavior...

I suspect that TeX is 1/4 to 1/3 larger and more complicated than necessary;
would be interesting to recode it in a purely functional language and see
how and if the complexity is reduced.

>Arch D. Robison

							stan shebs
							shebs@cs.utah.edu

nevin1@ihlpf.ATT.COM (00704a-Liber) (02/16/88)

In article <1982@ho95e.ATT.COM> wcs@ho95e.UUCP (46323-Bill.Stewart,2G218,x0705,) writes:
.	Remember that 1st-year students, even CS students, are studying
.	more than just CS100, and, if possible, they should have *some*
.	usable programming knowledge as soon as possible.  While it's
.	probably a Bad Thing to expose them to BASIC, whatever
.	functional language you teach them had better be adequate for
.	doing chemistry and physics homework, numerical integration for
.	calculus, statistics for their psych classes, and the like.

I like the catch phrase 'usable programming knowledge' :-).  1st year students
usually do not need anything more than a calculator for intro courses in chem,
physics, calc, psych, etc.  Take calculus, for example.  Most of the
integration problems are not numerical but symbolic.  If a 1st year student can
write a sufficiently complex (another catch phrase :-)) symbolic integration
program in BASIC, then there is no need to teach him BASIC in CS100.  Let him
get on to what CS is all about (vs what the field of numeric calculation is all
about).  And if he is just learning programming for the first time he won't be
able to program that symbolic integration program in time for it to make a
difference in his other courses.  (Personally, I'd buy an HP-28S and learn to
program that; it is a lot easier to drag it around than an IBM PC and most of
the needed software is already built in.  Then again, this assumes that I
already know how to program a little to start with).

.	If you have to do this by providing standard library functions
.	in your Random-Lisp distribution, fine.  But if you don't do
.	it, they'll be off writing scurvy hacks in RPN or BASIC or
.	Lotus Macro Language, learning all the bad habits you're
.	trying to shield them from.  And somewhere along the line,
.	engineering students will *have* to learn Fortran, if only so
.	they can do interesting *engineering* research without having
.	to rewrite EISPACK and its hench-programs.

If they already know how to write the hacks in RPN or BASIC there is no need to
teach them those languages.  And engineers DO NOT have to learn FORTRAN!!
Unless you are doing much numeric processing (and even then there are other
languages you can do it in), there are enough software packages out on the
market so that you never have to write programs on that low a level.  (I am an
engineer and the only reason I know FORTRAN is because during a summer job I
had to interface to some graphics routines and my choice of languages were PL/I
and FORTRAN, and all I could find was the FORTRAN manual.)

.>	> [ ... ]  I have just changed jobs.  In no interview was I asked
.>	> whether I knew ML; in every interview I was asked if I knew C.
.> [ good negative comment about interviewers ]
.	The college I went to didn't offer COBOL, either, though it's
.	probably useful in a "how the other half lives" survey course.
.	I've been meaning to learn Zetalisp to upgrade my resume.
.	(it starts with "Ada" or "Algol", depending on the interviewer.
.	Any real interviewer will want to know what I *really* can do,
.	but to interview at larger companies you may need the buzzword list
.	there so the personnel people will pass it on to the people you
.	really care about.)

And no one is saying that you won't have that buzzword list by the time you
finish college.  Why do you need it starting with CS100??  On my resume, I have
no less than 20 languages.  Interviewers would ask me what my favorite
language is.  I would tell them Icon, and would then explain why I liked it,
what it was good for, and what it was NOT good for.  And Icon is not exactly a
buzzword for most jobs.  Interviewers want to know what you can do, and not so
much what you have already done.  Learn the good programming habits first; then
pick up all the 'fad' languages.  It will pay off a lot better in the end.
-- 
 _ __			NEVIN J. LIBER	..!ihnp4!ihlpf!nevin1	(312) 510-6194
' )  )				"The secret compartment of my ring I fill
 /  / _ , __o  ____		 with an Underdog super-energy pill."
/  (_</_\/ <__/ / <_	These are solely MY opinions, not AT&T's, blah blah blah

dwh@twg-ap.UUCP (Dave Hamaker) (02/18/88)

In article <545@tuck.nott-cs.UUCP>, anw@nott-cs.UUCP writes:
> ... This is pedagogically disastrous.  "Forcing" them to do *anything*
> will ensure that they treat you as the opposition....
and other wise things which express an attitude that teachers should lead
by example rather than ideological edict.

Hear! Hear!

john@frog.UUCP (John Woods, Software) (02/19/88)

In article <545@tuck.nott-cs.UUCP>, anw@nott-cs.UUCP writes:
>In article <1154@zen.UUCP> vic@zen.UUCP (Victor Gavin) writes:
>>One of the main problems that I have noticed in classes is all the snotty
> 								     ^^^^^^
>>nosed kids who have ``learned'' to program at home on their BASIC computers,
> ^^^^^
>>*all by themselves*.
> 	We obviously have a better class of student than you do.  Why knock
>     people who are motivated enough to do something "all by themselves"?
> >Unfortunately their methods and practices usually stink.
> 	We obviously have a better class of student than you do.

Indeed, I learned BASIC "all by myself" in early high school (though I learned
it on the school's computer).  I also became quite a fan of Warnier-Orr
diagrams, and thus learned to structure code in a language that had no concept
of it (Dartmouth-style BASIC, none of this extension nonsense).

>     ...[name your favourite language]... Further, their Basic on a home
>     computer very likely runs rings around [nyfl] on [nyf computer] when
>     it comes to graphics, sound, games, etc.

Their machine might even run rings around [nyf computer] in terms of speed:
as a Freshman at MIT, I wrote a toy LISP interpreter in BASIC (largely to
irritate those who asserted that BASIC was on a lower plane of existence)
on a TRS-80 that a friend of mine had.  He then proceeded to USE that LISP
for assignments during the last week of school, because the load average on
MIT-MULTICS had passed infinity about a week before that... :-)

If you can show people that your concepts help ease programming, those who
will ever pick it up will do so, and happily; those who don't, probably
would not anyway.  If you show people that they are insignificant little worms
for ever having learned something that you disapprove of, you'll just
preach to the converted.


--
John Woods, Charles River Data Systems, Framingham MA, (617) 626-1101
...!decvax!frog!john, ...!mit-eddie!jfw, jfw@eddie.mit.edu

"Cutting the space budget really restores my faith in humanity.  It
eliminates dreams, goals, and ideals and lets us get straight to the
business of hate, debauchery, and self-annihilation."
		-- Johnny Hart

pcm@iwarpo.intel.com (Phil C. Miller) (02/21/88)

This discussion has been going on long enough so that it's getting tough to
say who is saying what.  Therefore, I'll preface each person's comments with
their name instead of a cryptic series of punctuation characters.

nevin1> refers to nevin1@ihlpf.UUCP (00704a-Liber,N.J.)
wcs>    refers to wcs@ho95e.UUCP (46323-Bill.Stewart,2G218,x0705)
pcm>    refers to me (Phil Miller, pcm@iwarpo.UUCP).

wcs>.	Remember that 1st-year students, even CS students, are studying
wcs>.	more than just CS100, and, if possible, they should have *some*
wcs>.	usable programming knowledge as soon as possible.  While it's
wcs>.	probably a Bad Thing to expose them to BASIC, whatever
wcs>.	functional language you teach them had better be adequate for
wcs>.	doing chemistry and physics homework, numerical integration for
wcs>.	calculus, statistics for their psych classes, and the like.

Good point, Bill.  This is generally the point I was trying to make.

nevin1> 1st year students usually do not need anything more than a 
nevin1> calculator for intro courses in chem, physics, calc, psych, etc.

I think you are dating yourself.  First year science courses were
starting to make serious use of computers 15 years ago when I was a
college freshman.  As computing resources become more widely used in
universities, students will need to know more and more about
programming.

Incidentally, I don't happen to feel that's a good thing.  It would be
nice if a physicist (for example) could stick primarily to physics and
not have to learn a second discipline (computer science).  Hopefully,
the legacy of our generation of programmers will be a considerable
simplification in computer user interfaces.

wcs.> And somewhere along the line, engineering students will *have* to
wcs.> learn Fortran, if only so they can do interesting *engineering*
wcs.> research without having to rewrite EISPACK and its hench-programs.

nevin1>   And engineers DO NOT have to learn FORTRAN!!

On which planet?  Your statement is rhetorical and doesn't have practical
value.  I think you would be hard pressed to find an engineering curriculum
which didn't require FORTRAN (with the possible exception of Computer Science,
which you may or may not count as engineering).

pcm> [ ... ]  I have just changed jobs.  In no interview was I asked
pcm> whether I knew ML; in every interview I was asked if I knew C.

I've been generally amused at the response I've gotten to this statement.
As a footnote, I'll add that I used to list ML on my resume as one of the 
programming languages I know.  I finally removed it because (1) nobody
had ever heard of it, and (2) when I explained what it was, they didn't
care.

wcs>.	but to interview at larger companies you may need the buzzword list
wcs>.	there so the personnel people will pass it on to the people you
wcs>.	really care about.)

...and smaller companies can't afford to train you how to program in C,
so you probably won't get a job there unless you know C.

nevin1> Interviewers want to know what you can do, and not so much what you
nevin1> have already done.

HUH??????????????  Now I'm convinced you're from another planet.  Companies
usually want someone to work on a specific project.  Someone with a track
record in a related area to that project will be considerably more valuable
than someone who only knows, say, functional programming languages.

nevin1> Learn the good programming habits first; then pick up all the 'fad'
nevin1> languages.  It will pay off a lot better in the end.

Now let me get this straight.  Fortran, which has been around since about
1950, and C, which is the implementation language for the de facto industry
standard operating system and about a jillion utilities and applications 
programs, these are the fad languages, right?

Incidentally, the implication here is that you can't learn 'good programming
habits' in C.  Bullshit.

Lucky for me, though, I know ML and Prolog, so if I ever have to go to your
planet, I'll be able to get a job ;-).

Phil Miller

robison@uiucdcsb.cs.uiuc.edu (02/23/88)

> It would be nice if a physicist (for example) could stick primarily to 
> physics and not have to learn a second discipline (computer science). 

This is becoming increasingly unlikely for physicists.  The problem
is not user interfaces or computer grammar, but good algorithms.
For example, most programmers know that linked lists are often
much quicker to manipulate than arrays.  How many physicists know
what a linked list is?  They probably don't need to know the gory
details of pointers, but should know about asymptotic complexity
analysis.  (I don't intend to sound condescending to physicists.  Some
knowledge of physics is useful in CS, for example simulated annealing
and ray tracing.)

Some friends and I conjecture that a lot of supercomputer time 
is spent running poor algorithms.  Anyone have any empirical
evidence or anecdotes?

Arch D. Robison
University of Illinois at Urbana-Champaign
	
CSNET: robison@UIUC.CSNET
UUCP: {ihnp4,pur-ee,convex}!uiucdcs!robison
ARPA: robison@B.CS.UIUC.EDU (robison@UIUC.ARPA)

pcm@iwarpo3.intel.com (Phil C. Miller) (02/25/88)

In article <170500013@uiucdcsb> robison@uiucdcsb.cs.uiuc.edu writes:
>
     In a related article, I (pcm) wrote:

>> It would be nice if a physicist (for example) could stick primarily to 
>> physics and not have to learn a second discipline (computer science). 

>This is becoming increasing unlikely for physicists.  The problem
>is not user interfaces or computer grammar, but good algorithms.

   I obviously didn't make my point clear enough.  I think that computer
   science will reach such an advanced state (at some point in the future)
   that applications programmers and users will have no need for such 
   knowledge.  

   At that point, architectural and algorithmic details will hopefully
   become background details which are handled transparently.  An example
   of where this has already occurred is in the use of overlays: virtual 
   memory has made one troublesome programming detail an obsolete 
   consideration on most scientific computers.

   I am hopeful that the work on sophisticated user interfaces (like the
   Mac) will continue to (r)evolutionize computer science.  Maybe someday
   physicists will be able to specify things at a more abstract level.

Phil Miller

kurt@tc.fluke.COM (Kurt Guntheroth) (02/25/88)

> It would be nice if a physicist (for example) could stick primarily to 
> physics and not have to learn a second discipline (computer science). 

Some of the Nobel Prizes recently awarded in Economics might almost as well
have been for Computer Science.  The same is becoming true for Biology and
Physics.  Why don't they just give up and give us our own category?

nevin1@ihlpf.ATT.COM (00704a-Liber) (02/25/88)

First off, I don't know if it is worth answering a posting from someone who
puts "/dev/null" in the 'Followup-To:' field of the header, as Phil did;
people spend time and effort posting constructive followups.  If you don't want
followups, Phil, don't post any articles! (no smiley)


nevin1> refers to nevin1@ihlpf.UUCP (00704a-Liber,N.J.)
wcs>    refers to wcs@ho95e.UUCP (46323-Bill.Stewart,2G218,x0705)
pcm>    refers to pcm@iwarpo.UUCP (Phil C. Miller)

nevin1> 1st year students usually do not need anything more than a 
nevin1> calculator for intro courses in chem, physics, calc, psych, etc.

pcm> I think you are dating yourself.  First year science courses were
pcm> starting to make serious use of computers 15 years ago when I was a
pcm> college freshman.  As computing resources become more widely used in
pcm> universities, students will need to know more and more about
pcm> programming.

It is you who are dating yourself.  As you said yourself, you were a
freshman 15 years ago.  I was a freshman in the fall of 1983.  I asked a number
of friends around the country who are currently going for their undergrad
degree or have recently (within 1 year) graduated what programming they did
for their non-CS 100 level classes.  So far, everyone I asked gave me the same
answer:  none.  (And most of my friends are NOT CS majors!!)  Most non-CS
100 level science & math courses are interested in teaching principles,
not number crunching algorithms.

Walk into a modern college PC lab and take a look at what the freshmen are
doing.  Most of them are only doing word processing.  (I'm ignoring CAI since
that is irrelevant to this discussion.  As a side note, one summer a
friend of mine wrote the CAI program for a class he had to take two
semesters later.  Now that's the way to get an 'A' :-))

The only thing I needed as a freshman was my trusty HP-15C (I probably
didn't even need something as powerful as that but it was fun to play
with :-)).  I could bring it to lab or to a test.  Try bringing a
floppy-based PC or a terminal to a test; you'll be thrown out.  And it
would require an EXTENSIVE amount of programming on my part to be able
to have the same power and flexibility that my calculator has.  And it
would be very expensive for me to buy all the software that I would need
(as well as software integration headaches) to match my $70 calculator.

As a matter of fact, the tests were graded very little on whether or not I had
the correct answer; the grading was skewed towards correct understanding of the
principles.  Most of my profs graded my homework and tests in the
following way:  1 point for the correct numerical answer, 9 points for
showing the work involved in getting that answer.  Most of the problems
in intro courses are one shot deals and are not really suited for
programming.  If I wrote a program I'd probably have to check the answer
with a calculator anyway (well, not me per se; ALL my programs work the
first time I type them in :-) :-)).

pcm> Incidentally, I don't happen to feel that's a good thing.  It would be
pcm> nice if a physicist (for example) could stick primarily to physics and
pcm> not have to learn a second discipline (computer science).  Hopefully,
pcm> the legacy of our generation of programmers will be a considerable
pcm> simplification in computer user interfaces.

This is about the only point I agree with you on.  I think we are going
towards this state.  More on this later.

nevin1>   And engineers DO NOT have to learn FORTRAN!!

pcm> On which planet?  Your statement is rhetorical and doesn't have practical
pcm> value.  I think you would be hard pressed to find an engineering curriculum
pcm> which didn't require FORTRAN (with the possible exception of Computer Science,
pcm> which you may or may not count as engineering).

My field of study was computer ENGINEERING (not COMPUTER SCIENCE,
although I have since seen the error of my ways :-)), which is more
similar to Electrical Engineering than CS.  There was only 1--count
'em--1 ELECTIVE which used Fortran--numerical methods.  I used to
work for an engineering consulting firm (mainly mechanical
engineers) and all the number crunching they had to do (which was very
little) was done with specific off-the-shelf packages; no one there did
any programming.  The off-the-shelf products were simply much easier to
use and hence cheaper (in terms of total time) than having to write
everything they needed in Fortran.

pcm>. [ ... ]  I have just changed jobs.  In no interview was I asked
pcm>. whether I knew ML; in every interview I was asked if I knew C.

Me neither, but I have been asked what my favorite PL was.  It's much
more interesting to talk about Icon than to talk about C (this is not a
flame against C; C is my third favorite language after Icon and C++).

nevin1> Interviewers want to know what you can do, and not so much what you
nevin1> have already done.

pcm> HUH??????????????  Now I'm convinced you're from another planet.  Companies
pcm> usually want someone to work on a specific project.  Someone with a track
pcm> record in a related area to that project will be considerably more valuable
pcm> than someone who only knows, say, functional programming languages.

STOP MISQUOTING ME!!  I never said that you should know ONLY
functional PLs.  As a matter of fact, I said that you SHOULD know a
variety of types of PLs!

I do agree that knowing a related area is a plus.  But, using an analogy,
would you hire a car mechanic because he knows how to use a
screwdriver?  A language is just a tool; nothing more.
But good programmers are much more than coders!
Most companies would rather hire someone with knowledge in
their field than someone who only knows how to code.

nevin1> Learn the good programming habits first; then pick up all the 'fad'
nevin1> languages.  It will pay off a lot better in the end.

pcm> Now let me get this straight.  Fortran, which has been around since about
pcm> 1950, and C, which is the implementation language for the de facto industry
pcm> standard operating system and about a jillion utilities and applications 
pcm> programs, these are the fad languages, right?

By 'fad' I mean currently popular; I have said nothing on whether the
language is 'good' or 'bad'.  The first 'fad' HLLs were FORTRAN and
COBOL; then came BASIC, then Pascal, and finally C.  In the future I
would suspect that C++ would be next (just a guess, though).

We have found better languages and techniques since the '50s.  Model-Ts
have been around since the 1900s, and a 'jillion' people bought them,
but you don't find too many of them on the highways anymore :-).  The
application programs are becoming simpler to use; you shouldn't have to
program a computer to use one (in much the same way that you shouldn't
have to build a car to drive one).

pcm> Incidentally, the implication here is that you can't learn 'good programming
pcm> habits' in C.  Bullshit.

You can USE good programming techniques in C, but I don't think that you
will LEARN good techniques and style from just learning C.  Now, if all
C programs were required to pass through lint without any errors or
warnings, that might be a different story... :-).
-- 
 _ __			NEVIN J. LIBER	..!ihnp4!ihlpf!nevin1	(312) 510-6194
' )  )				"The secret compartment of my ring I fill
 /  / _ , __o  ____		 with an Underdog super-energy pill."
/  (_</_\/ <__/ / <_	These are solely MY opinions, not AT&T's, blah blah blah

dph@beta.UUCP (David P Huelsbeck) (02/27/88)

In article <2848@omepd> pcm@iwarpo3.UUCP (Phil C. Miller) writes:
 [...]
>
>   I obviously didn't make my point clear enough.  I think that computer
>   science will reach such an advanced state (at some point in the future)
>   that applications programmers and users will have no need for such 
>   knowledge.  
>
>   At that point, architectural and algorithmic details will hopefully
>   become background details which are handled transparently.  An example
>   of where this has already occurred is in the use of overlays: virtual 
                                                                  ^^^^^^^
>   memory has made one troublesome programming detail an obsolete 
    ^^^^^^
>   consideration on most scientific computers.
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^
 [...]
>Phil Miller


I think you're right when you say most, but ...

Crays at least do *NOT* have virtual memory.

I once tried to argue that this was a real shortcoming
with some of the hardcore coders around here. (see organization line)
Of course, since most of my experience and information
came from my undergraduate CS studies and their experience
and information came from *real* experience, I lost.

The fact is that until some sort of amazingly sophisticated
automatic analysis system is created, programmers (real ones
anyway) will know much more about their programs and their
memory requirements etc. than any compiler/os/runtime environment
will ever know.  So when real performance is a must there will
be no substitute for do-it-yourself, preprogrammed management like
overlays.  (Yes, people here use overlays often, for a variety of reasons.)

In the past I've seen people suggest that ease of expression and understanding
should always take precedence over trying to write efficient code, and that
faster hardware will solve the inefficiency problem.  Well, we've got
a lot of pretty fast hardware, and nobody seems satisfied that it runs
fast enough that programming efficiency can be ignored.

However, as you've said, maybe some day computer science will advance to
the point where no obscure code or special hardware understanding will
be needed to write optimal code.  Let's hope so.  But I don't think
we should all start holding our breath quite yet.

	David Huelsbeck
	dph@lanl.gov
	{ihnp4,cmcl2}!lanl!dph

	#include <std/disclaimers.h>

steve@nuchat.UUCP (Steve Nuchia) (02/28/88)

From article <170500013@uiucdcsb>, by robison@uiucdcsb.cs.uiuc.edu:
> Some friends and I conjecture that a lot of supercomputer time 
> is spent running poor algorithms.  Anyone have any empirical
> evidence or anecdotes?

I think you're right....

A friend of mine (NOT a FOAF) has a half-dozen of these fancy
wall clocks with little plaques on them praising him for saving
the company (a NASA contractor, so indirectly the government)
some number of dollars.  The company's cost-reduction reward
program includes several gifts, of which the clock represents
the highest level.  Needless to say, the bean counters won't
let him have any of the lesser gifts even though he has all
the clocks he can use....

These come from walking by a bunch of programmers working on
some "trivial" program and making small changes that produce
significant reductions in run time.

One example, which is fresh in my mind as I had to endure
a retelling of it last night over some suds:  A certain
trivial algorithm was to be run over some data.  It performed
an operation analogous to averaging - the details are unimportant.
In addition to being naively written with respect to numerical
issues (got the wrong answer) it had this other problem...
It took over two hours to run for something like 600,000
records...   Turns out the dataset was in some binary format
and these clowns were turning it into card images and reading
it with FORTRAN formatted input.  Recoding to read the original
binary data cut the runtime to a much more reasonable figure,
like maybe two minutes.

Since they were planning to run the program "several thousand"
times over the next few months this amounted to something
over a million dollars in cost savings...

It's sad, but people who do good work are far outnumbered
by the marching morons, to the extent that management in
many industries thinks software is _supposed_ to be difficult
and expensive and not work very well.   sigh.

Oh - another anecdote - it seems this programmer had been
submitting version after version of this little program
to the cray... like for a month.  It just kept getting
the wrong answer, and he couldn't figure out why.  My friend
says "What answer is it giving you?"   "Zero."  "Hmmm, where
is it calculating it?"  "Right here."

"Well, if you change that 1 to a 1.0 it won't do that any more..."

If I had ever worked around a super computer myself I could
probably give you some first hand anecdotes - I sure have enough
from the micro and mini realm.
-- 
Steve Nuchia	    | [...] but the machine would probably be allowed no mercy.
uunet!nuchat!steve  | In other words then, if a machine is expected to be
(713) 334 6720	    | infallible, it cannot be intelligent.  - Alan Turing, 1947

oz@yunexus.UUCP (Ozan Yigit) (02/29/88)

In article <1154@zen.UUCP> vic@zen.UUCP (Victor Gavin) writes:
>
>I personally feel that the teaching of a non-procedural language in the first
>year of a computing science course is an excellent idea.
>
	My first programming language (ignoring a lousy course on
	PL/C) was APL, taught extremely well, and to this day, I
	consider that APL course to be a *turning point* both for my
	CS education and my career. [Probably lisp would have had a
	similar impact, provided that the dialect is scheme].

	Of course, after APL, learning to live (years later) with UN*X 
	and C was much easier. :-) 

oz (once-warped-by-apl)


-- 
Those who lose the sight	     Usenet: [decvax|ihnp4]!utzoo!yunexus!oz
of what is really important 	    	     ......!seismo!mnetor!yunexus!oz
are destined to become 		     Bitnet: oz@[yusol|yulibra|yuyetti]
irrelevant.	    - anon	     Phonet: +1 416 736-5257 x 3976

robison@uiucdcsb.cs.uiuc.edu (03/01/88)

The anecdote of steve@nuchat.UUCP reminds me of a bizarre
and poor set of subroutines written in FORTRAN by an engineering 
student.  The student wanted to do bit manipulations for stuff
like fast Fourier index computation.  So he wrote routines
which represented binary numbers with decimal digits.  That is
the number 9 would be converted to 1001, where 1001 means
``one thousand and one.''  He then wrote weird routines to
simulate bitwise logical operations and bit reversals of the
decimal representations.  Sort of a computational Victor/Victoria: 
a binary machine simulating a decimal machine simulating a binary machine!

Arch D. Robison
University of Illinois at Urbana-Champaign
	
CSNET: robison@UIUC.CSNET
UUCP: {ihnp4,pur-ee,convex}!uiucdcs!robison
ARPA: robison@B.CS.UIUC.EDU (robison@UIUC.ARPA)

sommar@enea.se (Erland Sommarskog) (03/01/88)

A Cunnigham (cstjc@itspna.ed.ac.uk) writes:
>>...and smaller companies can't afford to train you how to program in C,
>>so you probably won't get a job there unless you know C.
>
>But if you know HOW TO PROGRAM then it doesn't matter if you know C ( or any
>other language for that matter ). All you need is the manual and a language
>definition.

Basically right.  But the people in the personnel department may
have another point of view. 
-- 
Erland Sommarskog       
ENEA Data, Stockholm        
sommar@enea.UUCP           "Souvent pour s'amuser les hommes d'equipages
                            and it's like talking to a stranger" -- H&C.

emo@iuvax.cs.indiana.edu (03/02/88)

   

To offer some more information: Lisp has been ``around'' since
the '50s as well.  In fact, I have often heard that Lisp is
second only to FORTRAN in being the oldest (high-level) language.

And, as ``oz'' mentions, Scheme can contribute to a ``turning
point'' in one's education...  The first language that I learned
was UCI Lisp running on a DEC-10 (access was via a 300 baud
phone link... and I thought that was reasonable at the time, ~'79)...
then came Pascal... then some assembler (6809 variety)... then
I learned Scheme...  C (and Unix) followed shortly.
What can be said of this chaotic learning strategy?... I would
have to say that personally, the high points were UCI Lisp
(being the first introduction to computing) and Scheme (being
the first time I really understood the ideas behind structural
design and the power of abstraction).  For several reasons
learning C was important too, but the foundations had already
been constructed upon the pillars of Lisp/Scheme.
Knowing what I do today, I would say without hesitation, that
Scheme is an excellent choice for a first language... 

In ending, I'd like to restate what someone has said previously:
programming languages are tools, a means to effect solutions
to problems in the ``real world''.  However, there is such a
thing as ``style''... the ease and pleasure one finds in the
task of programming depends to a large extent on one's
understanding of the tool(s) being used, as well as of the
problem domain.  By starting from a clean structure,
e.g. Scheme with its clean semantics, one is able to build a
more stable foundation upon which to base the development and
understanding of programming methodologies, as well as the
theories behind them.

Are there any other people who have formed similar conclusions
about Scheme?

eric

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
Eric Ost		     CSnet:   emo@indiana
Computer Science Dept.       Usenet:  emo@iuvax
Indiana University           Arpanet: emo@iuvax.cs.indiana.edu
Bloomington, Indiana 47405
(812)-335-5561
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

shebs%defun.utah.edu.uucp@utah-cs.UUCP (Stanley T. Shebs) (03/03/88)

In article <8200001@iuvax> emo@iuvax.cs.indiana.edu writes:

>[...] I would say without hesitation, that
>Scheme is an excellent choice for a first language... 

I agree.  On the other hand, I have not observed that students
who learn Scheme first are significantly better at doing "real"
programming - their C code is just as ugly.  (This is based on
a rather limited sample.)  Has anyone been able to demonstrate
that teaching a particular language first has a real effect on
program quality after graduation?

						stan shebs
						shebs@cs.utah.edu

csrdi@its63b.ed.ac.uk (Janet: rick@uk.ac.ed) (03/04/88)

In response to Jonathan Eunice's response to Tony Cunningham's response to....

tjc>  CS1 courses exist to teach people about computer science. They are
tjc>  not in general service courses for other departments. 

jon>What makes you think that good programming techniques are going to be
jon>any different when found in physicists, say, than when found in
jon>computer scientists?  

I don't think Tony ever said they would be.  What he perhaps didn't make
clear was that Computer Science courses *qua* Computer Science courses
necessarily involve teaching good programming techniques. 

Another point is that physicists (for example) will require (for the
foreseeable future) different programming techniques from computer
scientists. The most a computer science course can do for them is teach
them the essentials. After that, they get it from their own department.

This may reflect a difference between American and British universities;
any further thoughts about this from anyone?

>Programming is a SKILL, not a science.  

Agreed. However, like any skill, there are certain ways of doing things
which can be formalised into a 'scientific' basis. 

>You use "computer science" as though it is some higher religion that is
>defiled by teaching it to heathens, unbelievers, or people from
>other departments.

You mean it isn't? :-)

Seriously, though, you missed one of Tony's points. In this university,
we teach one first-year course for intending Computer Science graduates,
one for students of other science departments who need to be taught
computing, and one for non-scientists. That, at least, is how the three
courses (Computer Science 1A, Computer Science 1B and Information
Systems 1) are perceived. That is not to say that only intending
Computer Scientists take CS1A; I was an Arts student when I did the
course, but that's a whole 'nother posting. (Oh no not another they cry!)

Returning to the point, the contents of each course are different; in
particular the CS1A course (for intending Computer Science graduates)
contains theoretical material missing from the CS1B course. On the other
hand, CS1B covers more general issues (not in great detail) which are
missing from the CS1A course (most of which are covered in greater
detail in later years). Included in this is the existence of other
languages, like LISP and ML.

bill?>  [...] It would be nice if a physicist (for example) could stick 
bill?>  primarily to physics and not have to learn a second discipline 
bill?>  (computer science).  

Very nice. Unfortunately I don't see this happening in the near
future, although I feel it should be one of the goals of Computer
Science.

>Yes, but if a scientist is going to use a tool, like a computer, it
>would be a good idea to learn how to use it, to gain some skill at it.

Couldn't agree more. This is why the Computer Science department teaches
first year engineers how to program. And a bit about how the machines
work, as well.

jon>  Again, let's not confuse the art of programming with science.

H'm...does this mean "let's not call programming a science when it is
really an art", or does it mean "let's not confuse this art of
programming by bringing science into it"? (grin)

Seriously, again, there is a scientific element to certain areas of
computing, with effects in the field of programming. In particular,
things like algorithm analysis and formal methodologies can be of use to
the programmer. 

tjc>  Computer Science most certainly IS NOT engineering. If anything it's a 
tjc>  branch of mathematics.

jon>Then your computer science is different from the one I see.  The bulk
jon>of what people call computer science is people designing and building
jon>new operating systems, compilers, expert systems, hardware, languages,
jon>communication networks, and so on.  This is *engineering*...

[ the rest of Jonathan's lengthy bit about science v engineering deleted ]

Ah. We have here, it seems to me, too great a distinction being drawn
between engineering and science. Engineering, in this context, seems to
me to be an application of science. This is not to say that science is in
any way superior to engineering; more that they're different sides of
the same coin. A coin called technology.

Engineers require a certain amount of scientific knowledge. Chemical
engineers need to know chemistry, mechanical and electrical engineers
need to know a certain amount of physics. This material provides a
theoretical basis to the technology with which these people work.

Likewise, then, computer programmers require a certain theoretical basis
from which to work. From this we can build up a discipline which could
be called 'software engineering'. To quote from Jonathan:

jon>The bulk of CS, as it is generally talked about, is engineering --
jon>designing, working out practical problems, implementing, etc.  

I submit that this work is simplified by the existence and *use* of a
theoretical basis from which this engineering can proceed. I'm not sure,
but I think Jonathan agrees with this to some extent:

jon> There is no *science* being done in these applied areas, even
jon>though the application may further knowledge in the field.  

I think part of the overall problem is a difference in semantics. Engineers
in the States, if I recall rightly, are part of a highly respected
profession. Unfortunately in the UK, a certain snobbery still exists
which condemns engineers to being a lower form of life than scientists.
Hence, engineering is cognate with greasy hands and sweaty brows rather
than white coats and air-conditioned offices.

Back to Jonathan:
jon>The science of computers is the less-popular theory --
jon>the study of things like computability, decidability, and algorithms.

Returning briefly to the issue of 'engineering vs mathematics', can I 
point out that issues such as computability and decidability arose from
work on formal systems which predates the existence of computers, and
which was largely carried out by logicians and mathematicians? It is
from this formal background that our present day notions of what is
computable are derived. In this sense, Computer Science is indeed a
branch of mathematics.

As Tony's pointed out in another article, perhaps if more programmers
took an interest in this work, we would have better programs being
written. 

pcm> [ ... ]  I have just changed jobs.  In no interview was I asked
pcm> whether I knew ML; in every interview I was asked if I knew C.

bill?>  [...] I used to list ML on my resume as one of the programming languages 
bill?>  I know.  I finally removed it because (1) nobody had ever heard of it, 
bill?>  and (2) when I explained what it was, they didn't care.

tjc>   If they don't care then they ain't worth working for!

jon>Anthony is clearly not going to get a job in the US.  

Well, that's you told, Antoine! Mind you, I agree. (Fortunately, I have
no desire to work in the US.)

nevin1> Learn the good programming habits first

jon>YES YES YES YES YES 

Yes. Now we come back to the origin of the matter. As I said a few postings
back, I have found that learning other programming languages *helps* one
learn good habits. First off, structure: this is enforced by languages
like Pascal and Modula, which therefore makes them good choices as teaching
languages. Languages like C, FORTRAN and IMP (another of our obscure
Edinburgh languages, but one of the most efficient OS's in the world is
written in it) do not enforce structure and therefore allow bad habits
to develop.

However, languages like Prolog and ML also have uses. The Artificial
Intelligence department uses Prolog for its undergraduate teaching. The
CS department teaches ML, partly to give a feel for other styles of
programming and partly to illustrate some of the theoretical
material taught in the course. Having programmed in ML certainly makes
Computability Theory more understandable; it has also affected the way I
think about functions when I'm writing in procedural languages. Even if
my prospective employer has never heard of it, I am a better programmer
for the experience. Likewise, my current work with Smalltalk has led to
a change in the way I view data structures. I'm not likely to find a job
working with Smalltalk immediately I graduate, but the knowledge is
worthwhile.


Well, that's that said. How do I finish up? I think I'll just stick in my
signature and leave it at that....

	--Rick.


-- 
Janet: rick@uk.ac.ed
BITNET: rick%uk.ac.ed@UKACRL
ARPA: rick@ed.ac.uk
UUCP: rick%uk.ac.ed%ukc@mcvax

"Life would be so much easier if everyone read the manual."

nevin1@ihlpf.ATT.COM (00704a-Liber) (03/04/88)

In article <15973@beta.UUCP> dph@LANL.GOV (David P Huelsbeck) writes:
.
.Crays at least do *NOT* have virtual memory.
.
.I once tried to argue that this was a real shortcoming
.with some of the hardcore coders around here. (see organization line)
.Of course being that most of my experience and information
.came from my undergraduate CS studies and their experience
.and information came from *real* experience, I lost.

Think about *why* Crays don't have virtual memory (at least the ones doing lots
of number crunching; I don't know about the ones running Unix).  In order to
implement it, an indirect reference (to a page table) must be made on each
memory reference.  This is a significant overhead for a machine that is
designed to run as fast as possible.  Also, if a page is not in memory, the
machine would sit around doing nothing waiting for the OS to put the page in
memory (and possibly swap one out).  And it is a lot more efficient to swap in
exactly over some code which is no longer needed (ie, use of overlays) than to
page as memory is needed.
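
To make the overhead concrete, here is a toy C sketch of that indirection
(nothing Cray-specific; all the names and sizes are invented for
illustration): every paged access goes through a table lookup before it
touches the data.

```c
/* Toy model of paged addressing: each access goes through a page
 * table before touching memory, where a real-memory machine would
 * touch memory directly. */
#define PAGE_SIZE 4096
#define NPAGES    16

static char  memory[NPAGES * PAGE_SIZE];
static char *page_table[NPAGES];   /* virtual page -> physical frame */

/* Map each virtual page onto itself, just so load() has something
 * to translate through. */
void init_identity_map(void)
{
    int i;
    for (i = 0; i < NPAGES; i++)
        page_table[i] = &memory[i * PAGE_SIZE];
}

/* One virtual load = one table lookup plus one memory access. */
char load(unsigned vaddr)
{
    unsigned page   = vaddr / PAGE_SIZE;
    unsigned offset = vaddr % PAGE_SIZE;
    return page_table[page][offset];   /* the extra indirection */
}
```

On a machine without virtual memory the load would be a single reference;
here every load pays for the page_table lookup first, and a missing page
would mean stalling on the OS besides.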

.So when real performance is a must there will
.be no substitute for do it yourself, preprogrammed management like
.overlays.  (yes, people here use overlays often for a variety of reasons)

I hope you are not implying that whenever you need performance, you should use
overlays!  This simply isn't true.  Overlays are just one of the techniques
used for efficient programming.

.In the past I've seen people suggest that ease of expression and understanding
.should always take precedence over trying to write efficient code and
.faster hardware will solve the inefficiency problem.  Well we've got
.a lot of pretty fast hardware and nobody seems satisfied that it runs
.fast enough that programming efficiency can be ignored.  

That is only stressed in school.  In the 'real world', people want both
efficient AND readable code.  Basically, use slightly harder-to-understand
code only if it will change the performance dramatically (of course this
statement is situation-dependent).
-- 
 _ __			NEVIN J. LIBER	..!ihnp4!ihlpf!nevin1	(312) 510-6194
' )  )				"The secret compartment of my ring I fill
 /  / _ , __o  ____		 with an Underdog super-energy pill."
/  (_</_\/ <__/ / <_	These are solely MY opinions, not AT&T's, blah blah blah

mike@ists (Mike Clarkson) (03/06/88)

Just a couple of really short points I'd like to make:

MIT teaches Scheme (a dialect of Lisp) in its first-year engineering course.
If you're not familiar with the book "Structure and Interpretation of
Computer Programs" by Abelson and Sussman (MIT Press), I recommend it very
highly.  It is incredibly beautiful from a computer "Science" point of view,
and intended for scientists and engineers at the same time.  I guess I feel
that these two cultures don't *have* to be polarized, but as someone pointed
out, good luck convincing the first-year syllabus director of that...

And secondly, although FORTRAN is abysmal beyond description, people who
have never used it for scientific programming should be aware of a couple
of very simple reasons why C, Pascal etc. are not viable alternatives.
1) There is no exponentiation operator defined.  Neither C nor Pascal has
an equivalent to the FORTRAN ** operator.  I assure you no scientist will
write much code in a language that doesn't have this - it's too painful.
2) Neither C nor Pascal handles complex variables.  For work in quantum
mechanics this is important, and it's again too painful to start using
records/structures for this.
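
To illustrate the pain Mike describes, here is a sketch of what the C
programmer has to write by hand (the type and function names are my own
invention): FORTRAN's built-in COMPLEX arithmetic and ** operator turn into
structures and function calls.

```c
/* What FORTRAN provides built in, spelled out by hand in C. */
struct cplx { double re, im; };

/* Complex multiply: in FORTRAN this is just A * B on COMPLEX
 * variables; in C every operation needs a function like this one. */
struct cplx cmul(struct cplx a, struct cplx b)
{
    struct cplx r;
    r.re = a.re * b.re - a.im * b.im;
    r.im = a.re * b.im + a.im * b.re;
    return r;
}

/* FORTRAN's X ** N for a non-negative integer N, by repeated
 * squaring.  (The C library does have pow(), but it is a function
 * call on doubles, not an operator in the language.) */
double ipow(double x, unsigned n)
{
    double result = 1.0;
    while (n) {
        if (n % 2) result *= x;
        x *= x;
        n /= 2;
    }
    return result;
}
```

Multiplying i by i this way gives -1 + 0i, as it should - but compare
`X = A*B` and `Y = X**10` in FORTRAN against the function-call versions
above, and the scientist's complaint is easy to understand.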


-- 
Mike Clarkson						mike@ists.UUCP
Institute for Space and Terrestrial Science
York University, North York, Ontario,
CANADA M3J 1P3						(416) 736-5611