[comp.sys.apple2] How to Choose a Programming Language

rhyde@ucrmath.ucr.edu (randy hyde) (05/09/91)

Here's a cute little ditty I swiped off BIX:

Which language is right for you?

In order to help you make a competent, uncomplicated choice concerning the
competition between complex, incompatible computer compilers, we have
composed this complete, compact, composite compendium comprising comparisons
to compensate for the complaints and complements of their compromises. We
hope you will find it comprehensible rather than compost.

Assembler:
        You shoot yourself in the foot.

Ada:
        The Department of Defense shoots you in the foot after offering you
        a blindfold and a last cigarette.

APL:
        GN </ FT ^ BLT

BASIC (interpreted):
        You shoot yourself in the foot with a water pistol until your leg
        is waterlogged and falls off.

BASIC (compiled):
        You shoot yourself in the foot with a BB using a SCUD missile
        launcher.

C++:
        You create a dozen instances of yourself and shoot them all in the
        foot. Not knowing which feet are virtual, medical care is impossible.

COBOL:
        USE HANDGUN.COLT(45), AIM AT LEG.FOOT, THEN WITH ARM.HAND.FINGER
        ON HANDGUN.COLT(TRIGGER) PERFORM SQUEEZE, RETURN HANDGUN.COLT TO
        HIP.HOLSTER.

csh:
        After searching the manual until your foot falls asleep, you shoot
        the computer and switch to C.

dBase:
        You buy a gun. Bullets are only available from another company and
        are promised to work, so you buy them.  Then you find out that the
        next version of the gun is the one that is scheduled to actually
        shoot bullets.

FORTRAN:
        You shoot yourself in each toe, iteratively, until you run out of
        toes. You shoot the sixth bullet anyway since no exception-processing
        was anticipated.

Modula-2:
        You perform a shooting on what might currently be a foot with what
        might currently be a bullet shot by what might currently be a gun.

Pascal:
        Same as Modula-2, except the bullet is not of the right type for the
        gun and your hand is blown off.

PL/I:
        After consuming all system resources including bullets, the data
        processing department doubles its size, acquires two new mainframes
        and drops the original on your foot.

Smalltalk, Actor:
        After playing with the graphics for three weeks, the programming
        manager shoots you in the head.

Snobol:
        Grab your foot with your hand and rewrite your hand to be a bullet.

Whoops. This is almost a long message...

**********************************

Some more--

C: You shoot yourself in the foot and then no one else can figure out what you
   did.

Pascal: You try to shoot yourself in the foot, but it tells you that your foot
        is the wrong type and out of range to boot!

ORCA/C: Byteworks keeps promising to supply good ammunition RSN!

Anyone claiming one language is better than another: You shoot yourself in the
foot and blow off your head because of where you had put your foot.

*** Randy Hyde

philip@utstat.uucp (Philip McDunnough) (05/10/91)

In article <14309@ucrmath.ucr.edu> rhyde@ucrmath.ucr.edu (randy hyde) writes:
>Here's a cute little ditty I swiped off BIX:

[ goes through many languages, cute comments...]

>APL:
>        GN </ FT ^ BLT

I grew up on APL (and Fortran). The gripe against its character set typically
comes from the same sorts of people who complain about abstract mathematics.
It seems to me that mathematics is a prerequisite for most good work of
any substance in most sciences. So the issue of not being able to deal with
symbols is very distressing. It has forced some of the APL community to
adopt an ASCII character set, and recently Iverson has come out with his
J language. I like APL, and would really like to see it on the GS. 
 
Every era seems to have its "in" language. C is the current one (and one that
I find very unappealing). There was PL/1 at one point. It seems that every
generation is told to learn this or else. People would be much better off
learning how various CPUs work, how mathematics works, etc. After the
concepts are established, they can get down to the passing fads.

Philip McDunnough
University of Toronto
[my opinions,...]

rhyde@sisler.ucr.edu (randy hyde) (05/10/91)

I use APL on the PC every now and then.  NOT ASCII.  A real APL character set
using the graphics display.  We can all still poke fun at the APL character
set though....   Actually, I think the line was poking more fun at the cryptic
programs APL programmers *typically* write than at the character set!

toddpw@nntp-server.caltech.edu (Todd P. Whitesel) (05/10/91)

rhyde@sisler.ucr.edu (randy hyde) writes:

>I use APL on the PC every now and then.  NOT ASCII.  A real APL character set
>using the graphics display.  We can all still poke fun at the APL character
>set though....   Actually, I think the line was poking more fun at the cryptic
>programs APL programmers *typically* write than at the character set!

I'll heartily agree with this. We went over APL in programming paradigms class
last week; the prof wanted to get it over with quickly. He gave a brief
overview of APL and some of the operations, and I thought of this:

	It's been said that C programs look like random punctuation.
	Well, APL programs ARE random punctuation.

The prof also gave as an example an APL program that found the first N primes
-- it was about 20 _characters_ long and he told us it'd taken him about 75
minutes to convince himself that that was in fact what the program did...
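
(For the curious: I don't have his program, but the classic APL one-liner
for the primes up to N runs along these lines, and it's about that length:

        (~R∊R∘.×R)/R←1↓⍳N

R←1↓⍳N is 2 through N, R∘.×R is their multiplication table, and the
compression keeps every element of R that isn't a product of two of them.
That's the standard textbook idiom -- not necessarily what he showed us.)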

Todd Whitesel
toddpw @ tybalt.caltech.edu

gwyn@smoke.brl.mil (Doug Gwyn) (05/11/91)

In article <1991May9.231820.26867@utstat.uucp> philip@utstat.uucp (Philip McDunnough) writes:
>So the issue of not being able to deal with symbols is very distressing.
>It has forced some of the APL community to adopt an ASCII character set, ...

It had nothing to do with the ability of PEOPLE to deal with symbols,
but rather with the unavailability of Iverson's funny symbols in most
computer character code sets.

>Every era seems to have its "in" language. C is the current one (and one that
>I find very unappealing). There was PL/1 at one point. It seems that every
>generation is told to learn this or else. People would be much better off
>learning how various CPUs work, how mathematics works, etc. After the
>concepts are established, they can get down to the passing fads.

I think every professional programmer should be able to READ well-written
programs in Fortran, Algol, Lisp, Pascal, C, and Ada, since one is likely
to encounter all of these in published articles of general interest to
programmers in any language.  You should of course be quite proficient in
whatever language you actually use in your work.

There is not much need for understanding CPU architectures for most
programming, and it can even impair one's ability to code portably.

Several lifetimes would not be enough to completely master all of known
mathematics.  I don't think that should be a prerequisite for computing.
Presumably, you had in mind a certain level of mathematical understanding.
It should be noted that the ACM has had committees study the whole area
of computer science education, and their recommendations have been
published.  Anyone who really cares about this should probably look it up.

gwyn@smoke.brl.mil (Doug Gwyn) (05/11/91)

In article <1991May10.083024.15828@nntp-server.caltech.edu> toddpw@nntp-server.caltech.edu (Todd P. Whitesel) writes:
>We went over APL in programming paradigms class last week; the prof wanted to
>get it over with quickly. ...

It's too bad you got a biased presentation of the language.
In actuality, one can code intelligibly in APL, and it has great
expressive power.  I've seen some cryptographic applications that
would take hundreds or thousands of times the effort to implement
in Algol-like languages.  It's handy for interactive exploration
of data, and indeed the "S" language is somewhat inspired by APL.
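
(A taste of that interactive flavor: the mean of a data vector X is just

        (+/X)÷⍴X

-- the kind of throwaway one-liner you type at the terminal while poking
at a data set. X here is any vector of numbers you happen to have loaded.)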

toddpw@nntp-server.caltech.edu (Todd P. Whitesel) (05/11/91)

gwyn@smoke.brl.mil (Doug Gwyn) writes:

>It's too bad you got a biased presentation of the language.
>In actuality, one can code intelligibly in APL, and it has great
>expressive power.

I understand that -- we saw how compactly APL can represent matrix operations
and it seems that APL is actually a lot like unix; once you understand the
seemingly cryptic shorthand you can type in rather powerful commands, programs,
and such almost as fast as you can think them through.
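
(The textbook example: if X and Y are conformable matrices, their product
is just

        Z←X+.×Y

and the same inner-product scheme generalizes -- X∧.=Y, for instance,
matches rows of X against columns of Y in a single expression.)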

My comments were intended in the humorous manner of the previous post.

Todd Whitesel
toddpw @ tybalt.caltech.edu

philip@utstat.uucp (Philip McDunnough) (05/12/91)

In article <16117@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
>In article <1991May9.231820.26867@utstat.uucp> philip@utstat.uucp (Philip McDunnough) writes:
>>So the issue of not being able to deal with symbols is very distressing.
>>It has forced some of the APL community to adopt an ASCII character set, ...

>It had nothing to do with the ability of PEOPLE to deal with symbols,
>but rather with the unavailability of Iverson's funny symbols in most
>computer character code sets.

The "funny symbols" have been available for virtually every computer for a
long time. The main complaint was not the unavailability (since there were
even ASCII equivalents before Iverson came out with J) but the so-called
unreadability of APL code, and the difficulty people had in maintaining
other people's programs. This difficulty can be traced to most programmers'
poor grasp of the concepts of mathematics and their inability to think in
abstract ways.
 
Philip McDunnough
University of Toronto
philip@utstat.utoronto.ca
[my opinions,...]

gwyn@smoke.brl.mil (Doug Gwyn) (05/12/91)

In article <1991May11.230407.1038@utstat.uucp> philip@utstat.uucp (Philip McDunnough) writes:
>The "funny symbols" have been available for virtually every computer for a
>long time.

That's certainly not true.  None of the usual code sets (ASCII, EBCDIC, etc.)
include the APL symbols, and even most typesetting fonts used with laser
printers do not include them.  There have been occasional special-order
computer terminals such as the Tektronix 4015 with an APL character set,
but most terminals have not provided APL symbols.

>The main complaint was not the unavailability (since there were
>even ASCII equivalents before Iverson came out with J) but the so-called
>unreadability of APL code, and the difficulty people had in maintaining
>other people's programs. This difficulty can be traced to most programmers'
>poor grasp of the concepts of mathematics and their inability to think in
>abstract ways.

In my experience the real difficulty lay in the abysmal lack of
professionalism on the part of APL coders, who would ignore software
engineering issues such as the need to anticipate maintenance
requirements.  It was a rare APL program that was adequately commented,
let alone designed in any greater depth for maintainability.

The same flaw can occur in improper use of almost any programming
language.  I've seen TECO macros that make APL programs look like
Basic English.  And of course there is the infamous Obfuscated C Code
contest, whose winning entries are almost incomprehensible.  None of
these reflect on the language, but rather on the programmer.

I don't know how you could have information about "most" programmers.
Do you really know that many people?  The best programmers I know of
certainly think at a high level of abstraction.

philip@utstat.uucp (Philip McDunnough) (05/13/91)

In article <16133@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
>In article <1991May11.230407.1038@utstat.uucp> philip@utstat.uucp (Philip McDunnough) writes:
>>The "funny symbols" have been available for virtually every computer for a
>>long time.
>
>That's certainly not true.  None of the usual code sets (ASCII, EBCDIC, etc.)
>include the APL symbols, and even most typesetting fonts used with laser
>printers do not include them.  There have been occasional special-order
>computer terminals such as the Tektronix 4015 with an APL character set,
>but most terminals have not provided APL symbols.

While it's true that most terminals have not had support for the APL
character set, there have been several suppliers of APL terminals. As for
laser printers using the APL character set, both STSC and Spencer (APL6800)
support printing APL code. Possibly you are using 20 years as a benchmark
for a long time.

>>The main complaint was not the unavailability (since there were
>>even ASCII equivalents before Iverson came out with J) but the so-called
>>unreadability of APL code, and the difficulty people had in maintaining
>>other people's programs. This difficulty can be traced to most programmers'
>>poor grasp of the concepts of mathematics and their inability to think in
>>abstract ways.
>
>In my experience the real difficulty lay in the abysmal lack of
>professionalism on the part of APL coders, who would ignore software
>engineering issues such as the need to anticipate maintenance
>requirements.  It was a rare APL program that was adequately commented,
>let alone designed in any greater depth for maintainability.

That's true in many, but not all, cases. It's not only APL programmers
who suffer from lack of foresight.

>The same flaw can occur in improper use of almost any programming
>language.  I've seen TECO macros that make APL programs look like
>Basic English.  And of course there is the infamous Obfuscated C Code
>contest, whose winning entries are almost incomprehensible.  None of
>these reflect on the language, but rather on the programmer.

>I don't know how you could have information about "most" programmers.
>Do you really know that many people?  The best programmers I know of
>certainly think at a high level of abstraction.

I work at an 80,000-student university and have been doing so for 14 years.
I've seen a lot of programmers pass by. Have a look at the mathematics
component of a typical undergraduate honours degree in Computer Science
and you will see what I mean. Of course there are exceptions. There are
not that many first-rate programmers. The best you refer to constitute
only a small proportion of the people programming for a living.

Philip McDunnough
University of Toronto
philip@utstat.utoronto.ca
[my opinions,...]

rhyde@ucrmath.ucr.edu (randy hyde) (05/13/91)

>>>>
There is not much need for understanding CPU architectures for most
programming, and it can even impair one's ability to code portably.
<<<<<

You should read what Donald Knuth had to say about this in vol I of ACP.

rhyde@ucrmath.ucr.edu (randy hyde) (05/13/91)

>>>
>We went over APL in programming paradigms class last week; the prof wanted to
>get it over with quickly. ...

It's too bad you got a biased presentation of the language.
In actuality, one can code intelligibly in APL, and it has great
expressive power. 
<<<<

That's how I feel about many people's attitudes towards assembly language
around here!
*** Randy Hyde

rhyde@ucrmath.ucr.edu (randy hyde) (05/13/91)

The "funny symbols" have been available for virtually every computer for a
long time. The main complaint was not the unavailability( since there were
even ascii equivalents before Iverson came out with J) but the so-called
unreadability of APL code, and the difficulty people had in maintaining
other people's programs. This difficulty can be traced to most programmers'
poor grasp of the concepts of mathematics and their inability to think in
abstract ways.
<<<

Not to mention most people's steadfast refusal to give up the imperative
programming language paradigm.  "What do you mean APL has no loops?  No
problem, it has a conditional goto, I can make my own..."
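
To make that concrete, here's roughly what such a programmer writes to sum
a vector (a sketch of the imperative style, names mine, not anyone's real
code):

        ∇ S←SUM V;I
          S←0
          I←1
        LOOP:S←S+V[I]     ⍝ accumulate one element per pass
          I←I+1
          →(I≤⍴V)/LOOP    ⍝ loop built from the conditional goto
        ∇

...when the APL answer is simply +/V.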
*** Randy Hyde

gwyn@smoke.brl.mil (Doug Gwyn) (05/15/91)

In article <14387@ucrmath.ucr.edu> rhyde@ucrmath.ucr.edu (randy hyde) writes:
>>There is not much need for understanding CPU architectures for most
>>programming, and it can even impair one's ability to code portably.
>You should read what Donald Knuth had to say about this in vol I of ACP.

I'm quite aware of what he said, and also that he has since said
that were he to start writing the Art of Computer Programming series
now, he would be expressing the algorithms in a high-level language
such as Pascal or WEB, instead of the pseudo-English goto-ful form
he used, and the implementations in MIX assembly would be omitted in
favor of the high-level language.