[comp.lang.misc] Language illiteracy

arenberg@trwrb.UUCP (Jeff Arenberg) (04/28/88)

I've been following the C vs. Fortran debate for some time now with
mixed feelings of humor and sadness.  I think that most of the people
who say one is so much better than the other, or who try to point out
features that one has but the other lacks, are really missing the
point.

I personally prefer C because of the way it is written; I can think a
problem through better in it.  But this is just a personal preference.
I'm reasonably sure that every program I've ever written in C (several
hundred) could have been written in Fortran as well.

The real problem is the hard-liners who aren't flexible enough to
switch back and forth as necessary.  I'm having a very difficult time
at work convincing those who pay my salary that it's OK to write code
in C.  They are engineers who learned Fortran and nothing else.  My
section head remarked during one of our debates, "If it's not written
in uppercase, I just can't read it."  This is not a problem with his
eyesight, but with his mentality.

There are a lot of good computer languages available (C, Fortran,
Pascal, Algol, and many more), and anyone who is serious about
programming should be able to work in any of them.  Regrettably, most
engineers, and indeed most of the people writing any kind of software,
are not serious programmers.

I think the only way to rectify this unfortunate situation is to
require that computer language classes teach more than one at a time --
say, by showing examples of code in at least three different languages
for every problem.  I seriously doubt this will ever happen, but I can
always hope.

In the meantime, I and all the other multi-lingual programmers must do our
best to cope with the stubborn and ignorant majority.

Jeff Arenberg
------------------------------------------------------------
UUCP : ( ucbvax, ihnp4, uscvax ) !trwrb!trwcsed!arenberg
ARPA : jarenberg@ecla.usc.edu
GEnie: shifty
------------------------------------------------------------

eugene@pioneer.arpa (Eugene N. Miya) (04/30/88)

Jeff makes some real points.  The joke is really on those who say
different languages are good for different things and then make
artificial distinctions between C and Fortran (I write some "scientific
numeric codes" in C).  These languages are quite similar (the people
who make these distinctions typically have not used LISP, Prolog, APL,
FP, SNOBOL, etc.).  A better example is to ask someone to
write a text editor, or to edit text, in Fortran (and to a lesser extent C).
[Groddy] You would certainly rather use an editor (a tool for the job).
The problem is that people (myself included, when I was younger) don't see editing
as a language, drawing as a language, mouse movements as a language (now I see).

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
				soon to be aurora.arc.nasa.gov
at the Rock of Ages Home for Retired Hackers:
  "Mailers?! HA!", "If my mail does not reach you, please accept my apology."
  {uunet,hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene
  "Send mail, avoid follow-ups.  If enough, I'll summarize."

cik@l.cc.purdue.edu (Herman Rubin) (04/30/88)

In article <8088@ames.arpa>, eugene@pioneer.arpa (Eugene N. Miya) writes:

> The problem is that people (myself included, when I was younger) don't see editing
> as a language, drawing as a language, mouse movements as a language (now I see).

An even bigger problem is that most people, including many mathematicians, are
unable to see that mathematics is an absolutely essential language _if you are
considering non-routine situations_.  
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette, IN 47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (ARPA or UUCP) or hrubin@purccvm.bitnet

mls@whutt.UUCP (SIEMON) (05/02/88)

In article <765@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
> 
> An even bigger problem is that most people, including many mathematicians, are
> unable to see that mathematics is an absolutely essential language _if you are
> considering non-routine situations_.  

Herman,
	The whole trouble with your position in these debates is shown here.
Mathematics is a huge congeries of "languages" with practically every user
effectively inventing an idiolect.  Much of the flexibility of mathematical
usage is the freedom to change your whole syntax & semantics -- sometimes
even in the same paper!  If you tried to comprehend in ONE formal system*
the notation of differential geometry, algebraic varieties (schemes, say),
C*-algebras and (say) measure theory, NOBODY would be able to follow ANY
argument.  Specialization of language is essential, even in the informality
of mathematical work.
------
*	Note: Bourbaki in effect tried to create a uniform mathematical
style, even across different "languages".  The success was at best partial.
-- 
Michael L. Siemon
contracted to AT&T Bell Laboratories
ihnp4!mhuxu!mls
standard disclaimer

jonson@unicom.UUCP (Mary D Johnson) (05/03/88)

I agree on the desirability of knowing more than one language.  Pity
the poor person (me, sometimes) who uses, teaches, and consults on a
Mac, an IBM-PC and friends, and a VAX, in Pascal, Basic, Fortran, or
Prolog (not to mention various applications), all in a single day --
and who is considered a traitor (!?) in all those worlds.

Remember, when those Fortran-only types bug you: in the land of the
blind, the one-eyed man is king.

Mary

wew@naucse.UUCP (Bill Wilson) (05/04/88)

I can agree with Jeff Arenberg's position.  I also believe that there
is no "PERFECT" language and that to do things efficiently one may have
to program in more than one language (ever hear of mixed-language
programming?  See the sketch below).  I can easily program in C,
FORTRAN, PASCAL, and a number of other languages.  I pick the right one
for the job and also take into account what code may already have been
written, so that I do not have to re-invent the wheel.
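
By way of a sketch: the names here are invented, and I am assuming the
classic Unix f77 convention (routine names lower-cased with a trailing
underscore, every argument passed by reference); your compiler and
linker may differ.

C     sumsq.f -- Fortran side: S = sum of squares of X(1..N)
      SUBROUTINE SUMSQ(X, N, S)
      INTEGER N, I
      DOUBLE PRECISION X(N), S
      S = 0.0D0
      DO 10 I = 1, N
         S = S + X(I)*X(I)
   10 CONTINUE
      RETURN
      END

/* main.c -- C side: declare the Fortran routine and call it. */
#include <stdio.h>

extern void sumsq_(double *x, int *n, double *s);

int main(void)
{
    double x[3] = { 1.0, 2.0, 3.0 };
    int n = 3;
    double s;

    sumsq_(x, &n, &s);
    printf("sum of squares = %f\n", s);    /* prints 14.000000 */
    return 0;
}

Compile the two halves separately and link them together (something
like "f77 -c sumsq.f; cc -o demo main.c sumsq.o -lF77" -- the library
name varies from system to system), and each language does what it is
best at.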

Let's be reasonable.  When the perfect language arrives, we can all
jump on the bandwagon.  In the meantime, why don't we discuss something
that may lead to more productivity?

One last question: why are some of the main data managers for MS-DOS
being translated from Fortran to C?

Also, I am writing a stat package in C, and my results are the same as
those of SPSS, SYSTAT, and SAS.  Since those packages are mostly
written in Fortran, does that suggest anything?


-- 
Bill Wilson
Northern AZ Univ
Flagstaff, AZ 86011
{These views are mine and do not necessarily reflect those of my employer}

nevin1@ihlpf.ATT.COM (00704a-Liber) (05/06/88)

In article <787@trwcsed.trwrb.UUCP> arenberg@trwcsed.UUCP (Jeff Arenberg) writes:
>There are a lot of good computer languages available (C, Fortran,
>Pascal, Algol, and many more), and anyone who is serious about
>programming should be able to work in any of them.  Regrettably, most
>engineers, and indeed most of the people writing any kind of software,
>are not serious programmers.

Algol is readily available?? :-)   I feel that there is a distinction
between being able to write code in a given language and *programming*
in a given language.  For example, when a Pascal programmer first
learns C, he/she tends to write C code that looks like Pascal (some
even go to the extreme of doing #define BEGIN {, etc.).  Each different
language has a paradigm that goes with it, and in order to be a good
*programmer* (vs. coder) in a given language one must also use the
paradigm.
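
To make this concrete, here is a caricature (a made-up fragment, though
I have seen real code that comes close):

/* Pascal in C clothing: */
#define BEGIN {
#define END   }

int sum(int a[], int n)
BEGIN
    int i, s;
    s = 0;
    for (i = 0; i < n; i = i + 1)
        s = s + a[i];
    return s;
END

/* The same routine, written as C: */
int sum2(int a[], int n)
{
    int s = 0;
    while (n-- > 0)
        s += a[n];
    return s;
}

Both do the same job; only the second was *programmed* in C.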

Unfortunately, it is a lot harder to remember the paradigm than the syntax
of a language (and unless you look at some good existing code, you
probably won't learn the paradigm).  Just translating FORTRAN into C is not
good enough.

>I think the only way to rectify this unfortunate situation is to
>require that computer language classes teach more than one at a time --
>say, by showing examples of code in at least three different languages
>for every problem.  I seriously doubt this will ever happen, but I can
>always hope.

The languages you mentioned are not all that different; they are all
von Neumann-type languages.  By programming in, say, C, FORTRAN, and
Pascal, you don't really learn new ways of approaching a problem.
Using C, LISP, and Smalltalk for the same problem might be a better use
of time.
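
You can get a faint hint of the difference without even leaving C.  A
contrived example: the same function written first as a loop over
mutable state, then as a recursion with no assignment at all, which is
the shape the definition would naturally take in LISP:

/* Imperative paradigm: a loop and an accumulator. */
long fact_loop(int n)
{
    long r = 1;
    while (n > 1)
        r *= n--;
    return r;
}

/* Functional flavor: no mutation, just recursion. */
long fact_rec(int n)
{
    return n <= 1 ? 1 : n * fact_rec(n - 1);
}

Real LISP or Smalltalk gives you far more than this, of course; the
point is only that the paradigm, not the syntax, changes how you think.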
-- 
 _ __			NEVIN J. LIBER	..!ihnp4!ihlpf!nevin1	(312) 510-6194
' )  )				"The secret compartment of my ring I fill
 /  / _ , __o  ____		 with an Underdog super-energy pill."
/  (_</_\/ <__/ / <_	These are solely MY opinions, not AT&T's, blah blah blah

nevin1@ihlpf.ATT.COM (00704a-Liber) (05/06/88)

In article <8088@ames.arpa> eugene@pioneer.UUCP (Eugene N. Miya) writes:
>A better example is to ask someone to
>write a text editor, or to edit text, in Fortran (and to a lesser extent C).

I don't think you could pay me enough to do this in FORTRAN!! :-)

>[Groddy] You would certainly rather use an editor (a tool for the job).

And writing a text editor in FORTRAN instead of C is using the wrong tool
for the job (in my humble opinion, of course)!

>The problem is that people (myself included, when I was younger) don't see editing
>as a language, drawing as a language, mouse movements as a language (now I see).

These are not (or should not be) languages, in the sense that you have
to program in them.  Things like mouse movements are better
characterized by making them a class in an object-oriented language.
They are tools that should be at your disposal, not languages in and of
themselves.
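
To sketch what I mean (in C, for lack of anything handier; every name
here is invented for illustration): a mouse is data plus the operations
on it -- a tool you invoke, not a notation you program in.

#include <stdio.h>

struct mouse {
    int x, y;                                  /* current position */
    void (*move)(struct mouse *, int, int);    /* its operations   */
    void (*click)(struct mouse *);
};

static void do_move(struct mouse *m, int dx, int dy)
{
    m->x += dx;
    m->y += dy;
}

static void do_click(struct mouse *m)
{
    printf("click at (%d, %d)\n", m->x, m->y);
}

int main(void)
{
    struct mouse m = { 0, 0, do_move, do_click };
    m.move(&m, 10, 4);
    m.click(&m);              /* prints: click at (10, 4) */
    return 0;
}

In a real object-oriented language the struct-of-function-pointers
scaffolding disappears, which is exactly my point.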
-- 
 _ __			NEVIN J. LIBER	..!ihnp4!ihlpf!nevin1	(312) 510-6194
' )  )				"The secret compartment of my ring I fill
 /  / _ , __o  ____		 with an Underdog super-energy pill."
/  (_</_\/ <__/ / <_	These are solely MY opinions, not AT&T's, blah blah blah

guido@cwi.nl (Guido van Rossum) (05/06/88)

In article <4654@ihlpf.ATT.COM> nevin1@ihlpf.UUCP (00704a-Liber,N.J.) writes:
>In article <8088@ames.arpa> eugene@pioneer.UUCP (Eugene N. Miya) writes:
>>A better example is to ask someone to
>>write a text editor, or to edit text, in Fortran (and to a lesser extent C).

Oh well, I once wrote a half-decent screen editor in COBOL!
--
Guido van Rossum, Centre for Mathematics and Computer Science (CWI), Amsterdam
guido@piring.cwi.nl or mcvax!piring!guido or guido%piring.cwi.nl@uunet.uu.net

eugene@pioneer.arpa (Eugene N. Miya) (05/06/88)

First, I wish to agree with Nevin's comments to Rubin.  I'm having the
same series of arguments with physicists and mathematicians at Ames,
LLNL, and SRC.  The best "evidence" they have given me is a chapter on
notation (variables) by Whitehead and comments about Feynman diagrams.
The problem, noted in a comp.arch posting by Steve Stevenson (he asked
forgiveness in teaching), is the fundamental problem of SYNTAX
conflicting with the unseen SEMANTICS.  In large part, this is one
problem.  (See Fred Brooks' "No Silver Bullet" paper.)  Conjecture:
people who speak several natural languages seem to understand this
better.

In article <4654@ihlpf.ATT.COM> nevin1@ihlpf.UUCP (00704a-Liber,N.J.) writes:
>>as a language, drawing as a language, mouse movements as a language (now I see).
>
>These are not (or should not be) languages, in the sense that you have
>to program in them.  Things like mouse movements are better
>characterized by making them a class in an object-oriented language.
>They are tools that should be at your disposal, not languages in and of
>themselves.

I beg to differ.  I once believed as you do here, and perhaps in the
future I will revert, but for now I cite Sutherland's 1966 paper, which
I also posted to comp.arch ["Computer Graphics: Ten Unsolved Problems,"
Datamation, May 1966 -- back when it was a respectable publication].

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
	resident cynic			soon to be aurora.arc.nasa.gov
at the Rock of Ages Home for Retired Hackers:
  "Mailers?! HA!", "If my mail does not reach you, please accept my apology."
  {uunet,hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene
  "Send mail, avoid follow-ups.  If enough, I'll summarize."

markv@uoregon.uoregon.edu (Mark VandeWettering) (05/07/88)

>In article <765@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
> 
> An even bigger problem is that most people, including many mathematicians, are
> unable to see that mathematics is an absolutely essential language _if you are
> considering non-routine situations_.  

	I thought I would leap into the fray and get my bytes in for the
	week.

	First of all, I found your statement above to be meaningless.
	Define "non-routine situations" for me, please.  I wouldn't say
	that mathematics is _essential_ to any activity.

	While I am a strong believer in functional programming, the one
	area of programming which does borrow heavily from mathematics,
	I also believe that languages should be practical.  While FP, ML,
	and SASL are neat to play with, I program in C and Lisp.

	The entire concept of a *perfect* language is silly.  There is
	no such animal.  Your claim that mathematics makes a good
	language is equally silly.  90% of the time I spend reading
	research papers is spent decoding some mathematician's pet
	notation.  Mathematics suffers from exactly the same problems as
	programming languages: ideas get muddled in notation.

	Of course, the ultimate in silly is the fact that you claim you
	can design a better language but then decline to give us a
	proof by example.

Enough.

mark vandewettering

nather@ut-sally.UUCP (Ed Nather) (05/08/88)

In article <1940@uoregon.uoregon.edu>, markv@uoregon.uoregon.edu (Mark VandeWettering) writes:
> 	Mathematics suffers from exactly the same problems as
> 	programming languages: ideas get muddled in notation.

It's much worse than that.  The basic notation -- and therefore the thought
processes it fosters -- describes a system of "eternal truth", usually
shown by the equals sign ( = ).  It not only says stuff on each side is
equivalent; it implies it always has been, and always will be.  Whatever
process change is needed must be artificially imposed from outside.

At one time mathematicians honestly believed they were manipulating, and
discovering, basic truths about the universe. "God is a mathematician."
Truth was eternal (like the universe itself).

Unfortunately, Goedel smashed the first idea, Hubble the second.  Change
is far more universal than stasis.  Turing was, I think, the first to
realize the enormous power of the simple concept embedded in

    i <- i + 1

which is, confusingly, often written as 

    i = i + 1

but when it is, it at least negates the idea that truth cannot change.

-- 
Ed Nather
Astronomy Dept, U of Texas @ Austin
{allegra,ihnp4}!{noao,ut-sally}!utastro!nather
nather@astro.AS.UTEXAS.EDU

debray@arizona.edu (Saumya Debray) (05/08/88)

In article <1940@uoregon.uoregon.edu>, Mark VandeWettering writes:
> 	Mathematics suffers from exactly the same problems as
> 	programming languages: ideas get muddled in notation.

This is silly!  Mathematical formalisms provide you with tools to define
and reason about your ideas in a precise and unambiguous manner.  If
someone can't use these tools effectively, the problem is with him, not
with mathematics.  Just because I can write unintelligible code in Lisp
or Prolog doesn't make them poor languages; just because I can flatten
my thumb with a hammer doesn't make the hammer a bad tool.

In article <11526@ut-sally.UUCP>, nather@ut-sally.UUCP (Ed Nather) writes:
> It's much worse than that.  The basic notation -- and therefore the thought
> processes it fosters -- describes a system of "eternal truth", usually
> shown by the equals sign ( = ).  It not only says stuff on each side is
> equivalent; it implies it always has been, and always will be.  Whatever
> process change is needed must be artificially imposed from outside.

That depends on the kind of system you're working with.  First order
predicate logic won't let you reason (directly) about change, but try the
various temporal, modal and dynamic logics that are around.
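
For instance, in a linear temporal logic one can state directly that a
counter eventually advances -- something like

    [] (i = n  ->  <> (i = n + 1))

where [] reads "always" and <> reads "eventually".  (The ASCII notation
is improvised, but the logic is standard.)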

-- 
Saumya Debray		CS Department, University of Arizona, Tucson

     internet:   debray@arizona.edu
     uucp:       {allegra, cmcl2, ihnp4} !arizona!debray

markv@uoregon.uoregon.edu (Mark VandeWettering) (05/09/88)

In article <5400@megaron.arizona.edu> debray@arizona.edu (Saumya Debray) writes:
>In article <1940@uoregon.uoregon.edu>, Mark VandeWettering writes:
>> 	Mathematics suffers from exactly the same problems as
>> 	programming languages: ideas get muddled in notation.

>This is silly!  Mathematical formalisms provide you with tools to define
>and reason about your ideas in a precise and unambiguous manner.  

	Perhaps I was overly terse in my answer (which ought to be
	considered a virtue, but what the heck).

	I agree, mathematical formalisms (notation) provide you with
	tools to concisely express ideas unambiguously.  Programming
	languages serve precisely the same purpose in the world of
	computer programming.  A program is a description of a task,
	written (hopefully) unambiguously.

	Now, the question is:  Is mathematics a good notation for
	describing problems that are typical in computer science?  

	My answer is: no.  

>If someone can't use these tools effectively, the problem is with him, not
>with mathematics.  

	My point is: the majority of tasks cannot be expressed within a
	strict mathematical framework.  Try to describe the actions of a
	modern operating system in terms of ANY formalism -- a feat which
	I am sure most will agree is beyond doing.  And the problem
	remains: now that I have described this operating system, can I
	actually convert the description into runnable object code for
	some machine?

>Just because I can write unintelligible code in Lisp
>or Prolog doesn't make them poor languages; just because I can flatten
>my thumb with a hammer doesn't make the hammer a bad tool.

	But a hammer is used for hammering, not for ballet dancing.

>In article <11526@ut-sally.UUCP>, nather@ut-sally.UUCP (Ed Nather) writes:
>> It's much worse than that.  The basic notation -- and therefore the thought
>> processes it fosters -- describes a system of "eternal truth", usually
>> shown by the equals sign ( = ).  It not only says stuff on each side is
>> equivalent; it implies it always has been, and always will be.  Whatever
>> process change is needed must be artificially imposed from outside.
>
>That depends on the kind of system you're working with.  First order
>predicate logic won't let you reason (directly) about change, but try the
>various temporal, modal and dynamic logics that are around.

	Again, if it can be translated into some sort of executable code
	for a machine, then it probably can be used as a programming
	language.  That doesn't guarantee that it is good at expressing
	tasks in a given problem domain.

	The level of depth of the postings in this area (Saumya Debray's
	excepted) has been very low, suggesting people who don't have
	significant depth in compilation and programming languages.

	I am not arguing against formalism or formal methods at all.
	I am arguing that traditional mathematical notations (such as
	predicate logic) are probably inappropriate forms in which to
	express tasks to a computer.

mark vandewettering

fpst@hubcap.UUCP (Steve Stevenson) (05/09/88)

Y'all are missing a fundamental point.  There are any number of
anecdotes showing that language can be misleading -- if not downright
deadly.  Otherwise propaganda would not work.  MORAL: beware of relying
on unevaluated language.

Up until the *Grundlagen der Arithmetik* there was no uniform concept
of language.  *Principia* tried to rectify this, giving rise to logical
positivism, with which we are stuck today.  Note that the prime movers
of positivism -- Whitehead, Russell, and Wittgenstein -- all gave up on
it.  The strict separation of syntax and semantics is an artifact of
Noam Chomsky's view of linguistics.  Remember, Aristotle and all
logicians up to Frege were interested in debate -- a much different
problem from axiomatic deductive theories.

For those of you who are in love with things mathematical: in my
experience, mathematicians are the hardest people to teach programming.
They have no concept of evaluation.  [For the record, I'm a
mathematician by training.]  Another difficulty is that cultural
imprinting is what supplies the clarity and semantics in most
mathematics.  Besides, there are many idiomatic uses of notation which
are understood but not technically correct.

{Begin Heresy
	Mathematical notation and content evolved in support of the
	sciences.  Please recall that Gauss -- the Prince of
	Mathematics -- was what we would call an astrophysicist today.
	If you want to emulate mathematics, then support your
	equivalent of "the sciences."
End Heresy}
-- 
Steve Stevenson                            fpst@hubcap.clemson.edu
(aka D. E. Stevenson),                     fpst@clemson.csnet
Department of Computer Science,            comp.parallel
Clemson University, Clemson, SC 29634-1906 (803)656-5880.mabell

nather@ut-sally.UUCP (Ed Nather) (05/09/88)

In article <5400@megaron.arizona.edu>, debray@arizona.edu (Saumya Debray) writes:
> In article <1940@uoregon.uoregon.edu>, Mark VandeWettering writes:
> > 	Mathematics suffers from exactly the same problems as
> > 	programming languages: ideas get muddled in notation.
> 
> In article <11526@ut-sally.UUCP>, nather@ut-sally.UUCP (Ed Nather) writes:
> > It's much worse than that.  The basic notation -- and therefore the thought
> > processes it fosters -- describes a system of "eternal truth", usually
> > shown by the equals sign ( = ).  It not only says stuff on each side is
> > equivalent; it implies it always has been, and always will be.  Whatever
> > process change is needed must be artificially imposed from outside.
> 
> That depends on the kind of system you're working with.  First order
> predicate logic won't let you reason (directly) about change, but try the
> various temporal, modal and dynamic logics that are around.
> 

Thank you for making my point so clearly.  The original discussion concerned
the use of mathematics as a programming language, pro and con, not logics
that use mathematics as a basis.  The original Fortran, for example, tried to 
look as much like formal mathematics as possible, but had to introduce many new
"non-mathematical" concepts and operations in order to be a useful programming
language. I'm sure that mathematics would have been used then, had it been
considered suitable.

-- 
Ed Nather
Astronomy Dept, U of Texas @ Austin
{allegra,ihnp4}!{noao,ut-sally}!utastro!nather
nather@astro.AS.UTEXAS.EDU

kelly@uxe.cso.uiuc.edu (05/09/88)

/* Written  1:57 pm  May  5, 1988 by nevin1@ihlpf.ATT.COM in uxe.cso.uiuc.edu:comp.lang.misc */
>Algol is readily available?? :-)   I feel that there is a distinction
>between being able to write code in a given language and *programming*
>in a given language.  For example, when a Pascal programmer first
>learns C, he/she tends to write C code that looks like Pascal (some
>even go to the extreme of doing #define BEGIN {, etc.).  Each different
>language has a paradigm that goes with it, and in order to be a good
>*programmer* (vs. coder) in a given language one must also use the
>paradigm.

Surely, that is the whole point of learning a new programming language.
If learning a new language doesn't give you a new paradigm for programming,
I don't think you should bother learning it.  Part of teaching the language
has to be delivering the appropriate paradigm.

>>I think the only way to rectify this unfortunate situation is to
>>require that computer language classes teach more than one at a time --
>>say, by showing examples of code in at least three different languages
>>for every problem.  I seriously doubt this will ever happen, but I can
>>always hope.

>The languages you mentioned are not all that different; they are all
>von Neumann-type languages.  By programming in, say, C, FORTRAN, and
>Pascal, you don't really learn new ways of approaching a problem.
>Using C, LISP, and Smalltalk for the same problem might be a better use
>of time.

A new paradigm is a new way of approaching a problem, isn't it?
If LISP and Smalltalk give me a whole new way of programming, I'd love
to learn them.  What do they offer an engineer or scientist doing
numerical programming?

kend@tekchips.TEK.COM (Ken Dickey) (05/17/88)

In article <51300008@uxe.cso.uiuc.edu> kelly@uxe.cso.uiuc.edu writes:
>A new paradigm is a new way of approaching a problem, isn't it?
>If LISP and Smalltalk give me a whole new way of programming, I'd love
>to learn them.  What do they offer an engineer or scientist doing
>numerical programming?

You might be interested in the "Abstraction in Numerical Methods"
tutorial to be given by Gerald Sussman and Matthew Halfant at the
upcoming Snowbird conference {1988 Lisp and Functional Programming
Conference, 1988 July 25-27}.  They will probably be using a Lisp/Algol
dialect called Scheme, which supports a wide variety of programming
styles, including functional/applicative and object-oriented.  Also,
there is an excellent text, "Structure and Interpretation of Computer
Programs" {Abelson & Sussman, MIT Press, 1985}, which contains a fair
number of numerical examples in various contexts.  Additionally, you
might note the large amount of work on computer algebra systems done in
various Lisps.
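
You can even get a pale imitation of the style in C, though Scheme
makes it far more pleasant, since procedures there are first-class
values.  A small invented example (mine, not from the tutorial): one
numerical routine, abstracted over the function it operates on.

/* Composite trapezoid rule on [a, b] with n panels; the integrand
   is passed in as a function pointer.  In Scheme it would simply
   be a lambda. */
#include <stdio.h>
#include <math.h>

double integrate(double (*f)(double), double a, double b, int n)
{
    double h = (b - a) / n;
    double s = (f(a) + f(b)) / 2.0;
    int i;

    for (i = 1; i < n; i++)
        s += f(a + i * h);
    return s * h;
}

static double inv(double x) { return 1.0 / x; }

int main(void)
{
    printf("%f\n", integrate(sin, 0.0, 3.14159265358979, 1000));
                                               /* ~2.0    */
    printf("%f\n", integrate(inv, 1.0, 2.0, 1000));
                                               /* ~ln 2   */
    return 0;
}

The same integrate() serves any integrand you hand it; stack up a few
such higher-order pieces and you have the beginnings of the kind of
abstraction the tutorial is about.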

-Ken Dickey
---------------------------------------------------
CSnet  : kend%tekchips.tek.com@relay.cs.net
ARPAnet: kend%tekchips%tektronix@csnet-relay
UUCP   : kend%tekchips@tektronix.TEK.COM
---------------------------------------------------