[net.lang] Flaming Objects -- letter of resignation

munyer@harvard.ARPA (Robert Munyer) (06/28/84)

This discussion has grown greatly both in volume and in vehemence, and
it no longer serves my original purpose (to defend to net.lang readers
the use of non-imperative languages in intro computing courses).  It
has begun to resemble the sort of thing one expects to find in
net.abortion or net.creationism.  Therefore, after dealing in this
letter with a few of Tom's points that I believe deserve to be
addressed, I would like to announce my resignation from this discussion
(battle? skirmish? limited nuclear war?).  I reserve the right to
change my mind if I read something particularly galling, or if the
discussion returns to a more reasonable level.


Paragraphs indented 0, 16, and 32 are mine.  Those indented 8 and 24 are
Tom's.  Those indented 40 are Kevin's (paraphrased).

	My apologies, first, to any who feel this inappropriate. My
	initial response was sent by mail, published without consulting
	me, and I just can't conTROL THIS INFERNAL URGE TO REPLY :-)

The ethical question involved did cross my mind.  I came to the
conclusion that there was nothing wrong with publishing the letter, by
using the (heuristic?) that in the "real" (i.e., non-computer) world,
once someone tells you his opinion, you are free to share it with
anyone else, UNLESS he/she has requested that you keep it to yourself.
Would any net.ethics wizards out there care to comment on this?


				All of the time that the C student
				spends learning the bizarre and
				complicated syntax of C (and even
				worse, trying to debug his C programs)
				is time that he CANNOT spend
				understanding such concepts as object
				oriented programming and procedural
				abstraction.

			Complicated compared to what? If it's
			complicated, it probably isn't because of the
			syntax.

		I'm not sure I understand what you mean here.  If you'd
		ever written a C program (like the one in front of me
		now) in which you needed to declare a part of a union
		to be a pointer to a function which returns a pointer
		to that kind of union, you'd agree with me that C's
		syntax is bizarre and complicated.  You shouldn't have
		to look at the source to a compiler to figure out how
		to declare something.  My other complaints about the
		syntax include a large number of extremely un-mnemonic
		operators, and such bizarre rules as "statements must
		be terminated by semicolons (exception: unless they are
		compound statements; braces need not be followed by
		semicolons (exception: unless they are used in
		initialization, in which case they DO need to be
		followed by semicolons.))"

	You seem to be confusing the syntax of the language with the
	semantics.  Is the syntax of C really qualitatively different
	than the syntax of PASCAL (or FORTRAN, or ADA)? I think not.
	FORTH has simple syntax (which some would argue makes it more
	difficult to use).

You don't seem to have paid attention to a single word I wrote.  EVERY
ISSUE discussed in the above paragraph is syntactic, not semantic.  I
think you decided what you would write before you even read my
paragraph.  (Notice that I *am* giving you credit for at least
understanding the difference between syntax and semantics.  I *do*
believe that you understand the difference, although you certainly have
not demonstrated it.)
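
For the curious, here is roughly what I am complaining about.  This is
not the program sitting on my desk -- the names are invented purely for
illustration -- but it shows both the declarator in question (a union
member that is a pointer to a function returning a pointer to that same
kind of union) and the semicolon rules I described:

    union node {
        int            value;
        union node   *(*handler)();   /* pointer to a function returning a
                                         pointer to this kind of union    */
    };

    int table[] = { 1, 2, 3 };   /* braces in an initializer:
                                    semicolon REQUIRED */

    int main()
    {
        union node n;
        n.value = table[0];
        if (n.value) {
            n.value = 0;         /* simple statement: semicolon required */
        }                        /* compound statement: NO semicolon     */
        return 0;
    }

Try working out the declarator for "handler" from first principles,
without peeking at a compiler's grammar, and then tell me again that the
syntax isn't complicated.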

				Moreover, many of these concepts are
				easy to illustrate in SCHEME and
				difficult or impossible in C, simply
				because C was not designed with them in
				mind.

			There ARE other programming language concepts
			to be taught besides those you cite!

		PLEASE name some of them for me, and I will be glad to
		show you how SCHEME and other higher-level languages
		are >= C for demonstrating nearly all of them.

	Sure. LISP and COMMON LISP (I'm not absolutely sure about
	SCHEME, and by the way, you probably ought to be mentioning
	which version of SCHEME you are advocating) have a very
	primitive notion of data structures.  Certainly you can, using
	the macros mentioned above, define things corresponding to data
	structures, but they are not part of the language, per se, and
	they require user-sophistication to implement correctly.

I disagree with you completely.  C, Pascal, FORTRAN et cetera, when
compared to COMMON LISP, have a very primitive notion of data
structures.  It is more difficult and time-consuming to implement any
but the simplest data structures in these languages than it is in T or
in COMMON LISP.
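
To make the comparison concrete, here is a rough sketch (again, the
names are invented) of what a three-element list costs you in C.  In T
or COMMON LISP the whole job is a single expression, something like
(list 1 2 3); in C you supply the type, the allocator, the linking, and
the eventual cleanup yourself:

    #include <stdlib.h>

    struct cell {
        int          head;
        struct cell *tail;
    };

    struct cell *cons(int head, struct cell *tail)
    {
        struct cell *c = malloc(sizeof *c);
        if (c != NULL) {
            c->head = head;
            c->tail = tail;
        }
        return c;
    }

    int main()
    {
        /* The C analogue of (list 1 2 3). */
        struct cell *list = cons(1, cons(2, cons(3, NULL)));

        /* Walk the list and free it -- bookkeeping LISP does for you. */
        while (list != NULL) {
            struct cell *next = list->tail;
            free(list);
            list = next;
        }
        return 0;
    }

And that is only a flat list of integers; my claim above is about
anything past the simplest structures, where the gap only widens.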

	The point is not that C is ">" any higher level languages, just
	that languages have special characteristics designed in for
	specific purposes. C has its own and SCHEME has its own, and
	they aren't the same.

This is true.  I (and everyone else on the net, I'm sure) agree with
this statement, and find it so obvious that it is silly even to mention
it.  If you had PAID ATTENTION to what I wrote, you would have seen
that I said that these languages are >= C *FOR A SPECIFIC PURPOSE*.

C was designed for a specific purpose -- to have a good language to
write UNIX in.  My claim is that the special characteristics that were
given it for this purpose do NOT make it a good language for the
purpose of teaching CS students, and, more specifically, for the
purpose of *demonstrating* important computer science concepts.


				T and other higher-level,
				object-oriented languages can take
				advantage of new and important advances
				(object-oriented programming, logic
				programming, parallel computation,
				non-von Neumann architectures) in
				a way that is simply impossible for C
				and other "procedural languages".

			I think you mean imperative languages. There
			are very few non-procedural languages in
			existence.

		Mea culpa, mea maxima culpa.  You have caught me in an
		error of terminology.  I DID mean imperative
		languages.  Although, of course, T (unlike C) can be
		elegantly extended to include features for
		non-procedural programming.  (Reference available on
		request).

	Yes, I would be interested in your reference. Functional and
	logic programming people, among others, have been looking at
	this for some time. "including features of non-procedural
	programming" isn't a very strong claim. LISP has proved to be a
	good base for embedded languages, but then C and PASCAL can
	also be used, so that's not too relevant. LISP can be made to
	look like a non-procedural language, but so can C and PASCAL. I
	don't understand your claim here.

But it IS relevant, and it is exactly my point.  LISP has proved to be
a good base for embedded languages; C and Pascal have not.  A language
has been embedded in LISP which effectively extends LISP to support
non-procedural programming.  You claim this can also be done in C and
Pascal.  Perhaps this is true, but I have never heard of it being
done.  If you have, I would like to see YOUR reference.  I personally
find the idea of non-procedural programming in C completely
preposterous.  [Before you people who write Prolog interpreters in C
start flaming at me, notice that we are discussing EMBEDDING a NPL in
C, not IMPLEMENTING one.  That means that you would be able to use an
arbitrary C function from within a Horn clause, and vice versa, in such
a way that you could think of what you are using as a single language,
not a combination of two.]


					Third: "The jobs the students
					want use more traditional
					languages (like C)."

				Granted, in 1984, bottom-level
				programming jobs (students' summer
				jobs, for example), often use C.  But
				in 10 years, or 5, or even when they
				are graduated in 3 years, this may no
				longer be true, at least in the more
				advanced (and therefore more
				interesting) jobs.  Think of how much
				the field has changed in just the last
				5 years.  I think that the
				language-independent computer science
				concepts I mentioned before will be
				much more useful to students in the
				long run than a familiarity with the
				current fad language would be.

			Utter nonsense. You obviously haven't spent
			much time in industry. It took Pascal 5-7 years
			to get into industry (and it may now have
			peaked). C is just on the upswing. If you knew
			anything about software life-cycles, you would
			happily go get your training in one or the
			other.  Or consider Ada, which is much closer
			in style to C and Pascal than to T or SCHEME.

		You obviously haven't spent much time keeping up with
		the news in the industry.  Artificial Intelligence is
		on the upswing, and virtually >>ALL<< of the important
		ideas in AI have been developed by users of LISP or
		similar languages.  [Note the word *developed*.  Users
		of other languages may use these ideas, but users of
		LISP *invented* them.] In fact, some of the important
		"advances" which Ada attempts to provide came
		originally from LISP or similar languages.

	...Fourth, users of LISP did not invent or develop AI. AI
	people invented LISP (and lots of other languages,
	incidentally).

That is entirely my point.  If you would PAY ATTENTION to what I said
in the above paragraph, you will see that I never attempted to imply
that AI sprang like some kind of technological Minerva fully formed
from the head of LISP.  What I *said* was that AI ideas were invented
by people who "just happened" to use LISP and similar languages.  They
might, just maybe, have had *good reasons* for using them.

	Fifth, ideas for ADA came from lots of different languages, and
	I think it would be lunacy to pick out LISP as somehow being
	the intellectual parent of them all.

Again, you have failed to pay attention to what I wrote.  If you look
again at your sentence above, and at the sentence to which you thought
you were replying, you will see what I mean.  [I said *SOME*, you twit.]

	Sixth, I have written an "expert system" in use in industry,
	and it certainly wasn't written in LISP. It would have been
	easier in LISP, but we'll just leave it as an exercise for the
	reader to figure it out (it was written in C, incidentally
	:-)).

Again, this is my point.  If you agree it would have been easier in
LISP, what do you have against using LISP to teach AI?  Or to teach
anything else, for that matter?  Merely because "people in industry"
need (or think they need) to use imperative languages to get the
efficiency they want, would you have us force students to do the same
when they are trying to learn the concepts involved?

		Another point, if you'll forgive a bit of snobbery:
		Here at Harvard, we like to think that our CS
		department turns out *computer scientists*, not
		coders.  We do not give students "training", but
		*instruction*.  I suspect that the "jobs in industry"
		of which you speak are mostly those of coders.  A two
		year trade school would do well to consider which
		languages are currently in most demand in this sort of
		job; a University, however, should have loftier goals.
		[For an extension of this idea, see recent article by
		M. Kenig to net.lang].

	I'll skip the obvious slur on your own notions of
	loftiness.  My (liberal arts) undergraduate institution didn't
	have ANY courses (for credit) in programming. They also haven't
	had any trouble getting people into (good quality) graduate
	schools in CS.

Are you trying to suggest that programming and/or AI should not be
taught to undergraduates at all?  If so, please make it more clear.  If
not, what has this statement to do with the current discussion?

	All the people from Harvard in AI that I have ever met came out
	of other departments than CS (which also has to do with
	Harvard's reticence in getting into the field).

	Once again, I don't think you know the slightest thing about
	who does what in industry, and you demonstrate that better than
	I can.

This borders on apophasis.  If I really do demonstrate my ignorance so
well, why do you feel compelled to keep telling us all about it?

	So I'd say that 1) Harvard CS has very little to boast about
	thus far, and 2) there are lots of people from other programs
	who are as good or better than what you claim you are
	producing.

Have you even *read* the other articles in this discussion?  I was not
*bragging* about Harvard's accomplishments, but discussing what (in my
opinion) its GOALS should be.  ("We like to think" was not a figure of
speech.  I *meant* it.)  For Christ's sake, what I have been doing in
all three of these letters amounts to defending *MIT*'s approach as
SUPERIOR to Harvard's.  Would you care to cast aspersions on the
ability of MIT to teach computer science?  Or to turn out computer
scientists?  Or to develop ideas which have become important to CS?  Or
... ad infinitum?

	Incidentally, I first learned of your public reply to (and
	publication of) my personal mail to you via a letter from one
	of your fellows (at Harvard, former TA of the course, and all
	that), who seems to be in considerable disagreement with your
	comments here.

Do you expect that to surprise me?  If you had paid attention to my
original article, you would have seen that the reason I wrote it in the
first place was that I disagreed with another TA's evaluation of the
"success" of Harvard's attempt to "copy" MIT's intro course.  [And, in
fact, the powers-that-be here agreed with HIM, since they returned the
course to something close to its original (10 year old) format].

			You walk in with a background in Lisp and
			they'll laugh.

		If they laugh at me, I'll feel sorry for them.  Giant
		industries have failed before because of the same
		avoidance of new ideas.  Anybody know of a good
		correspondence course in Japanese? (-;

	You'll also be unemployed.  Yes, companies have failed, and
	lots more have survived by sticking to the same old ideas.
	Ever heard of IBM?  Speaking about knowing anything, do you
	have any idea what ICOT's budget is for this year?

Yes, I have heard of IBM.  In fact, I know at least one person
[indirectly -- I have not met him in person] who was hired by IBM to
program in LISP.  He is not unemployed.  He was not laughed at.  IBM,
and perhaps also your present employer, might laugh at YOU, if they
read the tripe you have been shoveling onto the net.  Apparently, you
don't know nearly as much as you pretend to know about IBM or "the
industry" or anything else.  You seem to be one of those particularly
unpleasant people who believe that they can make themselves appear more
intelligent [to others or to themselves] by belittling others, whether
or not they have anything really useful to say.

(By the way, if IBM really *had* "stuck to the same old ideas", they
might be making *typewriters*, not computers. :-)

	Really, I don't pick the wings off of flies,

		Tom

Well, *I* wasn't the one who started the flaming.  My original response
to Kevin Crowston's article was as calm and polite a rebuttal as anyone
could ask.  Nothing I have published in this discussion [except THIS
article, of course] could reasonably be construed as a flame.  Your
postings, however, have been filled with personal attacks [which have
nothing to do with net.lang] and other insults.  In many cases you have
insulted me for writing something which I never actually wrote.  I have
very little respect for your "experience" in the industry (anybody can
write an expert system), and your debating skills are extremely poor.
Since no possible benefit (to the readers of net.lang or to the state
of CSE in this country or to anything else) can result from my
continuing to defend myself from attacks that have their basis only in
your inability to pay attention, I hereby resign from this discussion
as detailed at the top of this article.

I *will* post those references I mentioned.  Much of my "personal
bibliography" was destroyed by a system crash a few months ago, so I
will have to hunt down these references from scratch.

		((name "Robert Munyer")
		 (address (uucp  "...allegra!harvard!munyer")
			  (arpa  "munyer@harvard")
			  (Snail "15 Waldo Ave #1, Somerville MA 02143")
			  (phone "(617)628-3718")))

P.S.  Before I start drawing the fire of battalions of C users, let me
say that I really don't have anything against C, for certain purposes.
I have chosen it many times myself over a number of other languages.