[comp.arch] He's not the only one at it again!

cdshaw@cs.UAlberta.CA (Chris Shaw) (07/24/90)

In article gillett@ceomax.dec.com (Christopher Gillett) writes:
>In article peter@ficc.ferranti.com (Peter da Silva) writes:
>>In article gillett@ceomax.dec.com (Christopher Gillett) writes:
>>> Aha! Lets presume for a moment that you are truly a computer scientist,
>>> and that you buy into all the stuff that computer "science" teaches.
>>
>>Which computer "science"? The real one, or the straw man Bruce Karsh and
>>you keep bringing up.
>
>That's exactly my point! IMHO, there's really no such thing as
>"Computer Science". 

What is your point, that you give straw man arguments, or that you have this
well-known opinion?

>Physics, chemistry, biology, mathematics, are all real sciences.

Bull. Mathematics is not a "science". It is rigorous philosophy. Mathematics
intends to show things which are true regardless of what observations can be
made in the real world. In Kantian terms, Mathematics has essence,
but no existence.

>There is a fundamental underpinning for everything, and everything within 
>these fields proceeds from a well-understood, provable set of facts.

Also total nonsense. Name the alleged "provable facts" of biology. By 
"provable" I take it to mean that mathematical rigor must be applied.
The same is true of all the "hard sciences". There are NO PROVABLE FACTS.
There are only observations that conform reasonably to theory.
Let me give you a clear-cut example. Before 1890, a Physicist would tell you
that Newton's Laws were the Laws of Nature at any level, including the atomic.
He no doubt would claim to be able to "prove it". By 1920, atomic and subatomic
observations gave rise to a whole new atomic theory, and a whole new set of
"provable facts".

So what does this mean? Physics is not science? No, it means that the job of 
the scientist is to create and "prove" theories from observations. "Proof" is
much weaker in the scientific sense, because a new observation could knock it
all down. That is, there's always this implicit disclaimer that says "this
could all be wrong".

>Some elements of computer science certainly exhibit these traits, but for the
>most part it all seems to spring forth from a mostly subjective, arguable
>basis. I think that we should stop holding out our discipline as a science
>and call it what it really is...engineering.

The basic problem with this point is that you have failed to name what this
"mostly subjective, arguable basis" is. Sure, program indenting is arguable
and subjective. So what? Program indenting is computer practice, just as lab
technique in any of the "hard sciences" is physics or biology practice.
And yes, a binary tree is an arbitrary notion, but this doesn't mean that you
can't mathematically prove things about binary trees.
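
To pick one concrete illustration (a standard textbook fact, offered here
purely as an example): one can prove, by induction on h, that

    a binary tree of height h contains at most 2^(h+1) - 1 nodes,

and the proof doesn't care in the least whether anybody ever builds such a tree.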

>Borland is but one example of a company whose success is based upon
>their ability to deliver "performance products". For the environment and
>audience they've defined as market targets, their products are excellent.

Sure, but the problem is that these products are not what they claim to be.
Borland's products are not "Compilers", they are miniature special-purpose
compiler-based operating systems. In some cases, they are not compilers for
the languages that they claim. In other words, I'm complaining about Borland
lying to me about what their products are.

None of these "performance products" would exist on the IBM PC if the damn
machine itself wasn't so grossly limited by its OS. 

>Griping about Turbo Whatever not running in some foreign environment (like
>Double DOS) is akin to griping about how hard it is to get your date to ride
>in your cool new garbage truck. You need the right tool for the right job.

Again more nonsense. One of the basic problems with the Borland products is
that they bypass the operating system. Its strength (speed) is also its
weakness (unportability). Turbo xxx is in fact a step backwards in terms of
computer history, to the bad old days when you had to re-write every program
for each new machine. It was a pretty good observation made by whoever in the
50's that you could write programs with a consistent programmer interface,
but which ran on different machines. Hence FORTRAN.

The benefits of portability are clear, and there is no need to restate them.
However, some people believe that portability is not important, because they
will never have to port their program to a new machine type. However, it's
a tiny minority of people out there who don't get bitten by unportability
at some point.

>>Peter da Silva.   `-_-'
>
>Christopher Gillett               gillett@ceomax.dec.com


--
Chris Shaw     University of Alberta
cdshaw@cs.UAlberta.ca           Now with new, minty Internet flavour!
CatchPhrase: Bogus as HELL !

gillett@ceomax.dec.com (Christopher Gillett) (07/25/90)

In article <1990Jul23.231717.2766@cs.UAlberta.CA> cdshaw@cs.UAlberta.CA (Chris Shaw) writes:
>In article gillett@ceomax.dec.com (Christopher Gillett) writes:
>>Borland is but one example of a company whose success is based upon
>>their ability to deliver "performance products". For the environment and
>>audience they've defined as market targets, their products are excellent.
>
>Sure, but the problem is that these products are not what they claim to be.
>Borland's products are not "Compilers", they are miniature special-purpose
>compiler-based operating systems. In some cases, they are not compilers for
>the languages that they claim. In other words, I'm complaining about Borland
>lying to me about what their products are.
>

Go back and read the first paragraph.  I wrote that "For the environment and
audience they've defined as market targets, their products are excellent".
What I've heard so far is that they've experienced troubles making Turbo xxx
products work in environments other than what Borland themselves defined
as the "correct operating environment".  So who cares if they write to screen
memory all the time?  In a previous life, I wrote thousands of lines of Turbo
Pascal (and Turbo C when it became available).  The stuff that I wrote was
all pretty much boring run-of-the-mill applications that were designed to run
on your basic 640K, monochrome-or-EGA, hard disk PC-class machines.  I used
to write to screen memory all the time.  Doing so was quick, and the O/S
didn't prohibit it.  When I had to port my C applications to other machines,
I simply replaced the I/O module, and everything else clicked into place.
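
The shape of that I/O module was, very roughly, the sketch below. (This is a
reconstruction for illustration only, not my original code; the __MSDOS__ test,
the B800 segment trick, and the "far" keyword are Turbo-C-isms.)

    #include <stdio.h>

    /* Every screen write in the program goes through this one routine,
       so only this file has to change when porting. */
    #ifdef __MSDOS__
    /* Fast but nonportable: poke the character straight into text-mode
       video memory at segment B800 (80-column screen assumed). */
    void put_cell(int row, int col, char ch)
    {
        unsigned char far *vram = (unsigned char far *) 0xB8000000L;
        vram[(row * 80 + col) * 2] = ch;
    }
    #else
    /* Portable fallback: plain teletype output; let the target system's
       own screen handling worry about positioning. */
    void put_cell(int row, int col, char ch)
    {
        (void) row;
        (void) col;
        putchar(ch);
    }
    #endif

Swap in a curses-based put_cell, or whatever the target machine offers, and the
rest of the application never notices.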

Turbo xxx (Pascal and C) are true native code compilers...pure and simple.
They are certainly not operating systems.

>>Griping about Turbo Whatever not running in some foreign environment (like
>>Double DOS) is akin to griping about how hard it is to get your date to ride
>>in your cool new garbage truck. You need the right tool for the right job.
>
>Again more nonsense. One of the basic problems with the Borland products is
>that they bypass the operating system. Its strength (speed) is also its
>weakness (unportability). Turbo xxx is in fact a step backwards in terms of
>computer history, to the bad old days when you had to re-write every program
>for each new machine. 

Why would you have to rewrite every program just because you've chosen to 
use something from Borland?  Yes, if you use some of the stuff in their
standard libraries, or if you grab onto any of their language extensions
you're locking yourself in, but places where this happens are all reasonably
well documented so you can stay clear if need be.

I've got applications, written in C, that will compile without modification
(and without radical conditional compilation) in environments as diverse as
Ultrix, VMS, AmigaDOS, and MS-DOS.  There's nothing magic in how to do it,
and it certainly doesn't require any amazing engineering skills.

Yes, you can write nonportable, unfriendly code with Turbo whatever. 
But you can also write robust, portable code as well.

>It was a pretty good observation made by whoever in the
>50's that you could write programs with a consistent programmer interface,
>but which ran on different machines. Hence FORTRAN.

Yeah, and back then as now you could do all sorts of things to guarantee that
your FORTRAN program would never be portable.  Everyone here has probably 
seen all the trickery with oversubscripting arrays to write into memory, or
loading up arrays with machine code and then executing it, etc.  Nobody
blames the language or the compiler vendor for that.

BTW, I didn't reply to the portions of your posting dealing with the nature
and definition of science and whether or not computer science is a legitimate
science.  Not that I don't want to discuss it further, but I'm sick of all
the "drop dead" mail from .edu types and all the "get the f*** out of 
comp.arch" mail from the hardware types.  Let's continue that discussion
offline or someplace else.

>Chris Shaw     University of Alberta


/Chris

---
Christopher Gillett               gillett@ceomax.dec.com
Digital Equipment Corporation     {decwrl,decpa}!ceomax.dec.com!gillett
Hudson, Taxachusetts              (508) 568-7172

cik@l.cc.purdue.edu (Herman Rubin) (07/25/90)

In article <1990Jul23.231717.2766@cs.UAlberta.CA>, cdshaw@cs.UAlberta.CA (Chris Shaw) writes:
> In article gillett@ceomax.dec.com (Christopher Gillett) writes:
> >In article peter@ficc.ferranti.com (Peter da Silva) writes:
> >>In article gillett@ceomax.dec.com (Christopher Gillett) writes:
> >>> Aha! Lets presume for a moment that you are truly a computer scientist,
> >>> and that you buy into all the stuff that computer "science" teaches.
> >>
> >>Which computer "science"? The real one, or the straw man Bruce Karsh and
> >>you keep bringing up.
> >
> >That's exactly my point! IMHO, there's really no such thing as
> >"Computer Science". 
> 
> What is your point, that you give straw man arguments, or that you have this
> well-known opinion?
> 
> >Physics, chemistry, biology, mathematics, are all real sciences.
> 
> Bull. Mathematics is not a "science". It is rigorous philosophy. Mathematics
> intends to show things which are true regardless of what observations can be
> made in the real world. In Kantian terms, Mathematics has essence,
> but no existence.

Mathematics is pure grammar, but not philosophy of any kind.  I disagree 
about the existence part, however.  Computer science is the same sort of
thing.

But even pure grammar can have results on such things as how fast a program
can do something, given assumptions (these come from the hardware and the
nature of the problem) about how the components work.  Discussions about
how much effort is required to compute a given function to a given accuracy
predate the existence of non-human computers.
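
A small illustrative fragment (C, and purely my own example) of the sort of
result I mean: Horner's rule evaluates a polynomial of degree n in n
multiplications and n additions, where recomputing every power of x from
scratch costs on the order of n*n/2 multiplications.

    /* Horner's rule: p(x) = (...((c[n]*x + c[n-1])*x + ...)*x + c[0]. */
    double horner(const double c[], int n, double x)
    {
        int i;
        double acc = c[n];

        for (i = n - 1; i >= 0; --i)
            acc = acc * x + c[i];
        return acc;
    }

That operation count is a statement about the procedure itself; no machine has
to be run to establish it.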

			................................

>                       It was a pretty good observation made by whoever in the
> 50's that you could write programs with a consistent programmer interface,
> but which ran on different machines. Hence FORTRAN.

Not according to the facts.  The only language I know of in any remotely
wide usage at the time Fortran was created was (ugh) COBOL.  Fortran was
created specifically for casual programming on the IBM 704.  It was only
after it was produced that it was observed that it could be used on other
machines.  The extremely poor attempt to produce a machine-independent 
computational language ALGOL was the followup.

> The benefits of portability are clear, and there is no need to restate them.
> However, some people believe that portability is not important, because they
> will never have to port their program to a new machine type. However, it's
> a tiny minority of people out there who don't get bitten by unportability
> at some point.

Full portability is likely to be very costly.  It is frequently possible to
do a fair job by providing for extensibility in the language.  There have
been a few languages which allow adding types, including some Fortrans. 
C++ does it in a somewhat clumsy way.  The original Fortran allowed open
function calls, but not other open subroutines.  Inlining may or may not
achieve this.  There are many features of Fortran that were determined by the
architecture of the target machine and are very limiting, as was the idea that
Fortran was not for the construction of system libraries (at least when it was
produced).  Other
languages, like C, also show the influence of the target machines.

It is very difficult to remove limitations of a language after the language is
specified.  It is very difficult to overcome the weaknesses in hardware after
the design is set.  The presence or absence of a single hardware instruction
can provide a large factor in the comparison of algorithms.  A change in 
relative timings of instructions can greatly modify the choice of algorithms.
An intelligent computer scientist or machine designer will take all this into
account, and will also recognize that it is easy to miss important things.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet)	{purdue,pur-ee}!l.cc!cik(UUCP)

zenith-steven@cs.yale.edu (Steven Ericsson Zenith) (07/26/90)

In article <2400@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
|>Mathematics is pure grammar, but not philosophy of any kind.  I disagree 
|>about the existence part, however.  Computer science is the same sort of
|>thing.

From Webster's Ninth New Collegiate Dictionary:

Mathematics: The science of numbers and their operations, interrelations,
combinations, generalizations, and abstractions and of space configurations
and their structure, measurement, transformations and generalizations.

Philosophy: [...] 3 [...] b: a theory underlying or regarding a sphere of
activity or thought <the ~ of cooking><~ of science>.

|>>                       It was a pretty good observation made by whoever in the
|>> 50's that you could write programs with a consistent programmer interface,
|>> but which ran on different machines. Hence FORTRAN.
|>
|>Not according to the facts.  

Oh .. really?

|> The only language I know of in any remotely
|>wide usage at the time Fortran was created was (ugh) COBOL.  Fortran was
|>created specifically for casual programming on the IBM 704.  It was only
|>after it was produced that it was observed that it could be used on other
|>machines.  The extremely poor attempt to produce a machine-independent 
|>computational language ALGOL was the followup.

From Juliussen and Juliussen's Computer Industry Almanac:

1944 Grace Murray Hopper starts a distinguished career in the computer industry
     by being the first programmer for the Mark 1.
1953 IBM ships its first stored program computer, the 701. [...]
1954 FORTRAN is created by John Backus at IBM following his 1953 SPEEDCO program.
     Harlan Herrick runs first successful FORTRAN program.
1954 Gene Amdahl develops the first operating system, used on IBM 704.
1957 FORTRAN is introduced.
1958 ALGOL, first called IAL (International Algebraic Language), is presented
     in Zurich.
1959 COBOL is defined by the Conference on Data Systems Languages (Codasyl)
     based on Grace Hopper's Flow-Matic.

[Sorry no earlier mention of Flow-Matic]

|>  The extremely poor attempt to produce a machine-independent 
|>computational language ALGOL was the followup.

This is an unjustifiable comment, since ALGOL has had a far more profound
influence on the design of programming languages than either FORTRAN or
COBOL.

|>Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
|>Phone: (317)494-6054
|>hrubin@l.cc.purdue.edu (Internet, bitnet)	{purdue,pur-ee}!l.cc!cik(UUCP)
                                                                         
--
Steven Ericsson Zenith              *            email: zenith@cs.yale.edu
Fax: (203) 466 2768                 |            voice: (203) 432 1278
| "The world is a sacred vessel; It is not something that can be acted upon. |
|     Those who act on it destroy it; Those who hold on to it lose it."      |
Yale University Dept of Computer Science 51 Prospect St New Haven CT 06520 USA

jlg@lanl.gov (Jim Giles) (07/26/90)

From article <25630@cs.yale.edu>, by zenith-steven@cs.yale.edu (Steven Ericsson Zenith):
> In article <2400@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
> [...]
> |>  The extremely poor attempt to produce a machine-independent 
> |>computational language ALGOL was the followup.
> 
> This is an unjustifiable comment, since ALGOL has had a far more profound
> influence on the design of programming languages than either FORTRAN or
> COBOL.

Well yes, ALGOL _has_ had an enormous _negative_ impact on language
design (that still continues today).  Features like 'call-by-name' are
now recognized as bad by nearly everyone.  But, features like the
'compound statement,' which were found to be detrimental to programmer
productivity in the mid-70s, are still widely thought (by non-designers)
to be a good idea.

In fact, as far as I can tell, the design of ALGOL can be divided into
three distinct groups of features:

1) Features that it shared with Fortran (so, for these features,
   Fortran is the "profound influence").

2) Features which are different, but completely irrelevant (like using
   ":=" instead of "=" for assignment).

3) Features which were later found to be a bad idea.  (Take a look at
   the ALGOL 60 'switch' some time.)

To be sure, later versions of ALGOL had some interesting things that
have had a lasting impact - _BUT_, none of these new things appear
to be original to ALGOL (such things as 'struct' data types, which
appear to be cleaned-up versions of features already found in other
languages (including COBOL)).

In fact, only two features, that I can find, are original to ALGOL and
have a continuing positive influence on language design: if-then-else
and while().  These are important, and I don't want to denigrate their
worth, but they aren't sufficient to warrant your conclusion about
the disproportionate worth of ALGOL.  The majority of ALGOL features
which have a positive impact on new language design were those that
ALGOL got from Fortran and other sources.

Now, having said all that, I will also say that ALGOL had a more
important impact in the early days by becoming the _lingua_franca_
of the international programming community.  This has nothing to
do with ALGOL as a programming language - it is a recognition of
its value as a publishing language.

J. Giles

jlg@lanl.gov (Jim Giles) (07/26/90)

Whoops!!  I intended my last post to send followups to
comp.lang.misc, which is where this particular discussion
belongs.  Please send your followups there.  (If you read
this before you follow-up - which you should!  I never
send a followup until I've read all the relevant messages
first.  That way, I don't end up saying the same thing as
30 others have already said.)

J. Giles

cik@l.cc.purdue.edu (Herman Rubin) (07/26/90)

In article <25630@cs.yale.edu>, zenith-steven@cs.yale.edu (Steven Ericsson Zenith) writes:
> In article <2400@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:

			.........................

> From Juliussen and Juliussen's Computer Industry Almanac:
> 
> 1944 Grace Murray Hopper starts a distinguished career in the computer industry
>      by being the first programmer for the Mark 1.
> 1953 IBM ships its first stored program computer, the 701. [...]
> 1954 FORTRAN is created by John Backus at IBM following his 1953 SPEEDCO program.
>      Harlan Herrick runs first successful FORTRAN program.
> 1954 Gene Amdahl develops the first operating system, used on IBM 704.
> 1957 FORTRAN is introduced.
> 1958 ALGOL, first called IAL (International Algebraic Language), is presented
>      in Zurich.
> 1959 COBOL is defined by the Conference on Data Systems Languages (Codasyl)
>      based on Grace Hopper's Flow-Matic.

> [Sorry no earlier mention of Flow-Matic]

Well, I did get my dates somewhat wrong.  But there was considerable similarity
between the 701 and 704 and 709, and it is still the case that Fortran was
written specifically for  that machine design.

> 
> |>  The extremely poor attempt to produce a machine-independent 
> |>computational language ALGOL was the followup.
> 
> This is an unjustifiable comment, since ALGOL has had a far more profound
> influence on the design of programming languages than either FORTRAN or
> COBOL.

You are, unfortunately, absolutely correct here.  Fortran was not intended to
be complete.  ALGOL was, and failed miserably here.  ALGOL was intended to be
a programming language adequate for all numerical computations on all machines.
But it did not handle all the hardware options a good programmer would use on
the existing machines at that time.  Hardware produced a simultaneous quotient
and remainder; ALGOL made no provision for it.  Even then, mathematicians were
using multiple precision computations.  Likewise, no provision for that,
although most hardware had it.  A machine without overflow detection was
unusual; again, no provision in the language.
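
As an aside, ANSI C's library does at least expose the pair.  A trivial
example, mine and purely for illustration:

    #include <stdio.h>
    #include <stdlib.h>

    /* div() hands back quotient and remainder together; a compiler is
       free to get both from a single hardware divide. */
    int main(void)
    {
        div_t qr = div(17, 5);
        printf("17 / 5: quotient %d, remainder %d\n", qr.quot, qr.rem);
        return 0;
    }

ALGOL 60 gave the programmer no comparable way to ask for both results at once.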

I claim I have made a strong case against ALGOL being even a good programming
language for mathematics.  The weaknesses of ALGOL and Fortran are to a
considerable extent responsible for these instructions disappearing from
the hardware.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet)	{purdue,pur-ee}!l.cc!cik(UUCP)

amull@Morgan.COM (Andrew P. Mullhaupt) (07/27/90)

In article <58091@lanl.gov>, jlg@lanl.gov (Jim Giles) writes:
> In fact, only two features, that I can find, are original to ALGOL and
> have a continuing positive influence on language design: if-then-else
> and while(). 

Recursion? Recursive data structures? Required data declarations? 
Scope and re-usable variable names? 

I guess you have prior art for these? There aren't many candidates
for beating ALGOL to the punch. 

One of the most important aspects of ALGOL is the grammar on which
it was based. Context free grammars have since been almost universal
for high level languages, excepting FORTRAN, Lisp, APL, and other
elderly relics. The simplicity which the context free grammar brings
to the language cannot be overestimated; it is simple both for the
programmer and for the compiler writer at the same time.

Don't get me wrong; I use FORTRAN nearly every day and nothing else
I have access to would replace it. But I am glad that ALGOL has had
so much more influence on language design. Actually, FORTRAN serves
as a very powerful negative influence on language design - everyone
since FORTRAN has been wise enough to avoid a label based flow of 
control, and default typed variables are definitely out of style.
The list of deprecated features in FORTRAN (Equivalence, anyone?)
is long enough to uniquely qualify FORTRAN as the language whose
superb compilers succeed in the face of a wretched language design.
A quick glance at the FORTRAN 90 definition will tell you that the
development of FORTRAN is now playing catch-up with where the ALGOL
derivatives have gone years before.

Later,
Andrew Mullhaupt

P.S. I'm not sure, but I've only known of one or two high level
languages which I would call first rate, and neither one has
achieved really widespread use. I'm beginning to think that there
is some kind of conspiracy at work, but it's probably just mass
stupidity - er - I mean economics.

jlg@lanl.gov (Jim Giles) (07/28/90)

From article <1288@s8.Morgan.COM>, by amull@Morgan.COM (Andrew P. Mullhaupt):
> In article <58091@lanl.gov>, jlg@lanl.gov (Jim Giles) writes:
> [...]
> Recursion? Recursive data structures? Required data declarations? 
> Scope and re-usable variable names? 
> 
> I guess you have prior art for these? There aren't many candidates
> for beating ALGOL to the punch. 

LISP beat ALGOL to recursion.

ALGOL may indeed be the first with required declarations - that doesn't alter
my previous statement: my response was to the claim that only ALGOL had an
effect on future language development.  I could remember only two
positive features of ALGOL - now you've made it three.  Fortran has
contributed _many_ more than that.

In Fortran, all modules delimit distinct scopes.  Local names in one
module can be re-used as names in other modules.  So, Fortran had the
two features you mention.  What you probably _REALLY_ mean is _nested_
scope and static binding (where variables in the nested blocks 'cover
up' those outside).  As I've said in a previous post, these are not
universally regarded as positive features.  The belief that this method
was the proper way to delimit scope sure delayed the development of
'packages' or 'modules' for a number of years.
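
For anyone who hasn't run into the distinction, here is a minimal illustration
of 'covering up' (a throwaway C example of my own, not taken from any of the
languages under discussion):

    #include <stdio.h>

    int main(void)
    {
        int x = 1;
        {
            int x = 2;                           /* inner x covers up the outer one */
            printf("inner block: x = %d\n", x);  /* prints 2 */
        }
        printf("outer block: x = %d\n", x);      /* prints 1 again */
        return 0;
    }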

> One of the most important aspects of ALGOL is the grammar on which
> it was based. Context free grammars have since been almost universal
> for high level languages, excepting FORTRAN, Lisp, APL, and other
> elderly relics. The simplicity which the context free grammar brings
> to the language cannot be overestimated; it is simple both for the
> programmer and for the compiler writer at the same time. 

Context free means something different than you are using it for here.
What you are talking about (apparently) is free-form syntax (which is
a mixed bag - at least not all of the aspects of it as ALGOL defined
them are a good idea).  Context free has to do with the formal specification
of the syntax - Fortran is context free (in fact, it's LR(k) - it _would_
be LR(1) if blanks had been significant).

J. Giles

ok@goanna.cs.rmit.oz.au (Richard A. O'Keefe) (07/28/90)

This is really a comp.lang.misc issue; I'd redirect followups but
can't remember how.  Sorry.

In article <58091@lanl.gov>, jlg@lanl.gov (Jim Giles) writes:

> In fact, as far as I can tell, the design of ALGOL can be divided into
> three distinct groups of features:

> 1) Features that it shared with Fortran (so, for these features,
>    Fortran is the "profound influence").

> 2) Features which are different, but completely irrelevant (like using
>    ":=" instead of "=" for assignment).

> 3) Features which were later found to be a bad idea.  (Take a look at
>    the ALGOL 60 'switch' some time.)

The Algol 60 switch belongs in category (1), as it's a straight steal
of Fortran's "computed GOTO".

It isn't clear what := is supposed to be irrelevant to.

Algol imposed no arbitrary limit on the length of variable names.
- This falls into none of Giles's categories.  F90 relaxes the limit.

Algol was the first *well-known* language to use recursion, and this
led to the publication of the idea of stack frames and such.
- This feature is in F90 but falls into none of Giles's categories.

Algol was the first well-known language where the size of an array
was not fixed at compile time.
- This feature is in F90 but falls into none of Giles's categories.

Algol did not require continuation to be indicated by punching
anything-other-than-0 in column 6.
- This feature is in F90 but doesn't obviously fall into Giles's categories.

Algol's "for" statement was obviously based on Fortran's DO statement,
but it eliminated many restrictions.  The simple fact of being able to
write zero-trip loops was one of the things which made Algol 60 nicer
than Fortran for many algorithms.
- This cleanup is in F77

Algol did not inherit the idea of *requiring* local variables to be
typed from Fortran; has this now been found to be a bad idea?

Ironically, Giles praises the "while" statement, but Algol 60 *HAS*
no "while" statement.  (That's in Algol 60.1, which came years later.)
(Algol 60 has a while-like variation of "for", but no "while" as such.)
-- 
Science is all about asking the right questions.  | ok@goanna.cs.rmit.oz.au
I'm afraid you just asked one of the wrong ones.  | (quote from Playfair)

amull@Morgan.COM (Andrew P. Mullhaupt) (07/30/90)

In article <58372@lanl.gov>, jlg@lanl.gov (Jim Giles) writes:
> From article <1288@s8.Morgan.COM>, by amull@Morgan.COM (Andrew P. Mullhaupt):
> > In article <58091@lanl.gov>, jlg@lanl.gov (Jim Giles) writes:
> > [...]
> LISP beat ALGOL to recursion.
Maybe so. I must have the wrong time stamp in my head for Lisp: 1963.

> Context free means something different than you are using it for here.
No. It refers to the grammar by which the language can be parsed.
> What you are talking about (aparently) is free-form syntax (which is
> a mixed bag - at least not all of the aspects of it as ALGOL defined
> them are a good idea).  Context free has to do with the formal specification
> of the syntax - Fortran is context free (in fact, it's LR(k) - it _would_
> be LR(1) if blanks had been significant).

Well, FORTRAN might end up having an LR(k) grammar, but that was certainly
not a FORTRAN invention. Aho, Sethi and Ullman credit Knuth in 1965
with the introduction of LR grammars. Also: FORTRAN has a context
free grammar, but you don't get much for it until FORTRAN 77 and its
block style statements. Most people would consider the Algol 60
Report the first use of BNF to define a programming language, and the
equivalence between BNF and context free grammars was almost
instantaneously understood as a consequence of this report.
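
For anyone who hasn't looked at it, the flavor is roughly this (paraphrased
from memory, not a verbatim quote of the Report):

    <digit>            ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
    <unsigned integer> ::= <digit> | <unsigned integer> <digit>

Each production rewrites a single nonterminal with no reference to its
surroundings, which is all "context free" means.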

The more I think about this question, the more I think that the 
languages which become widespread aren't really the best ones,
and this is likely to remain the same. I am told that X windows
(which is another case of success by default) was the subject of
a remark by Rob Pike: "Sometimes you can fill a vacuum...and it
still sucks." The landscape of widespread software: UNIX, DOS, C,
FORTRAN, BASIC, COBOL, Turbo Pascal, PostScript, Lisp, APL, ...
all look like vacuum fillers to me. I think that the problem is
that people can design software on this scale only by solving
_some_ of the problems in front of them, and sweeping the others
under the rug. I have often had conversations with APL fanatics
in which their first remark was "well if you're going to talk about
interpreter overhead then you really don't understand APL..." as
if they always had all the time in the world to run their code.

It is one thing to look on the way we program computers as a great
achievement. It's pretty hard to have been involved in programming
since 1969 and not be stunned by what is now possible. On the other
hand, when you look at how everything gets done in reality, the idea
that theory gets put into practice fades pretty quickly. Practice
got ahead of theory even at the birth of FORTRAN, and we still have to
take the bad with the good. 

Oh, since you all asked: Algol 68 and (maybe) Extended Pascal were
the languages I had in mind as first rate. But if pressed, I would
retreat from these partially religious positions and admit that
all high level languages are second rate at best. And of course 
almost any high level language is better than assembler, but this
is not for any intrinsic reason, i.e. _it need not be so_.

Later,
Andrew Mullhaupt

zenith-steven@cs.yale.edu (Steven Ericsson Zenith) (07/30/90)

In article <58372@lanl.gov> jlg@lanl.gov (Jim Giles) writes:
> [...]  my response was to the claim that only ALGOL had an
>effect on future language development. [...]

In the postings that I have seen on the subject I don't think anyone
was claiming that ALGOL was the *only* language to affect future
language development. I did claim that ALGOL had a greater influence
than FORTRAN or COBOL - this may or may not be true. The discussion
seems to have degenerated into one of bigotry - "My favorite language
is better than your favorite language", and I don't think we stand to
gain from this. 

Nor did I intend to imply by my claim that ALGOL was a perfect
language - it surely wasn't - but it's ok to make honest mistakes,
that's how we learn and how our science evolves. It is an incredible
arrogance - characteristic of Comp Science and Architecture (and
perhaps other sciences too) - to believe that we have the ultimate
solution in our hands and that solutions cannot evolve - arrogance
enforced generally by making a plea to the god of "architectural
purity". Such arrogance is ignorant and blind (I should place these
comments in context. I'm not directing these comments at any member of
this discussion - I am venting frustrations born by my involvement in
other language and architectural developments.) - for some reason I
expect people to be more dispassionate about such things.

For its time ALGOL was a reasonable solution given what had gone
before, the style of its definition in the ALGOL 60 report was perhaps
more significant than the language itself - it led the way for many
other language reports - even today. Of the various language reports
that sat on my desk as I wrote the Occam 2 Reference (- and please,
let's not get into the deficiencies of that language design :-) the
ALGOL 60 report was the most useful example to have.

Regards,
Steven.
-- 
--
Steven Ericsson Zenith              *            email: zenith@cs.yale.edu
Fax: (203) 466 2768                 |            voice: (203) 432 1278
| "The world is a sacred vessel; It is not something that can be acted upon. |

zenith-steven@cs.yale.edu (Steven Ericsson Zenith) (07/30/90)

In article <3478@goanna.cs.rmit.oz.au> ok@goanna.cs.rmit.oz.au (Richard A. O'Keefe) writes:
>In article <58091@lanl.gov>, jlg@lanl.gov (Jim Giles) writes:
>
>> 2) Features which are different, but completely irrelevant (like using
>>    ":=" instead of "=" for assignment).
>
>It isn't clear what := is supposed to be irrelevant to.

a := b	means "assign the value of b to a".
a = b	means "a is equal to b".

The use of := distinguishes assignment from equality, thus prevents
overloading a single operator, and IMHO is a much nicer solution than the
C hack == used to overcome the same problem.
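
A couple of lines of C show why I call it a hack (a made-up fragment, not
aimed at anybody's real code):

    #include <stdio.h>

    int main(void)
    {
        int a = 0, b = 1;

        if (a == b)     /* equality test: false, nothing is printed */
            printf("equal\n");

        if (a = b)      /* one '=' short: a is assigned 1 and the test is "true" */
            printf("oops - an assignment where a comparison was meant\n");

        return 0;
    }

With := and = kept distinct, the slip in the second test simply fails to parse.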

Steven.
-- 
--
Steven Ericsson Zenith              *            email: zenith@cs.yale.edu
Fax: (203) 466 2768                 |            voice: (203) 432 1278
| "The world is a sacred vessel; It is not something that can be acted upon. |

zenith-steven@cs.yale.edu (Steven Ericsson Zenith) (07/31/90)

In article <1990Jul30.143530.24295@phri.nyu.edu>, roy@phri.nyu.edu (Roy
Smith) writes:
|>zenith-steven@cs.yale.edu (Steven Ericsson Zenith) writes:
|>> The use of := distinguishes assignment from equality [...] and IMHO is a
|>> much nicer solution than the C hack == used to overcome the same problem.
|>
|>	I agree that differentiating assignment from equality testing is
|>good, but why is using {:=, =} (no, that's not some kind of overgrown
|>smiley face!) any better or worse than using {=, ==}?  One might argue that
|>one is easier to type, or less likely to cause typos, or something like
|>that, but to call C's version a hack seems like you're overreacting a bit.
|>If you are going to invent a multi-ascii-character token for assignment,
|>why not "<-"?

                
Keyboard ease is a good pragmatic point. The principal objection to use
of = as a symbol meaning assignment is that the symbol most commonly
means equality outside of Computer Science. Things are complicated in
C since, in that language, assignment is an expression. The main argument
for := as an assignment operator is familiarity, since it is now widely used
with this meaning. This symbol is used in Occam for that reason and also
in Ease, although Ease extends its use to include allocation (the declaration
and possible initialisation of variables).

My objection to <- would be typographic. Fixed-width fonts (widely used
for program listings) make the visual distance between the < and the dash
exaggerated. Functional languages often use -> (see ML, Haskell),
and indeed Ease uses -> for type constraint. But I don't really
like the visual distance between dash and >. The problem is born from a 
desire to maintain compatibility with the ASCII character set.


--
Steven Ericsson Zenith              *            email: zenith@cs.yale.edu
Fax: (203) 466 2768                 |            voice: (203) 432 1278
"The tower should warn the people not to believe in it." - P.D.Ouspensky
Yale University Dept of Computer Science 51 Prospect St New Haven CT 06520 USA