[comp.lang.misc] The Fundamental Concept of Programming Language X

wsmith@mdbs.UUCP (Bill Smith) (12/30/89)

I have an idea regarding programming language design that I would 
like to toss out in order to see how accurately it describes the
high level thinking that goes into a new programming language.

The idea is "fundamental concept."  The fundamental concept of
a language is the data structure, control structure or style issue
that distinguishes the language and must be understood in detail before 
proficient use of the programming language can begin.

I think one of the weaknesses of this idea is that many languages have
more than one fundamental concept, and thus argument can begin over what
the true fundamental concept is.  (In other words, the idea is ill-defined
for some languages.)

Here is a table of my list of fundamental concepts.  (The whole point of 
this exercise is to abstract away the syntax of the language and get to 
what conceptual background is necessary to understand the language.   
Once this is done, the language may be summarized in one or two sentences.  
Once that is done, one could write original summaries for a language 
and then work backwards until a possibly useful programming language is 
created.)  Feel free to add or subtract from this list.   The idea is to 
generate discussion.

Language	Fundamental Concept

Lisp		Lists + dynamic scope
Scheme		Closures + static scope
Fortran 	Arrays + fixed allocation
C		pointers + dynamic allocation
Ada		Generics (?)
FORTH		Threaded code + postfix notation
Cobol		Formatting of data
BASIC		the statement as a unit of program
SNOBOL		Strings

Bill Smith
pur-ee!mdbs!wsmith

(Please let me know if this made it out.  I haven't seen anything in 
comp.lang.misc for almost a month and can't tell if it is because the group
has been inactive or because our feed to it has been cut off
by the nearby institution.)
(My opinions only)

eric@snark.uu.net (Eric S. Raymond) (01/01/90)

In <1470@mdbs.UUCP> Bill Smith wrote:
> Language	Fundamental Concept
> 
> Lisp		Lists + dynamic scope

Nah. Not all LISPs are dynamically scoped. One major dialect allows you to
choose dynamic or lexical scoping on a per-variable basis! Make that

Lisp		Lists as fundamental type + code as data + dynamic allocation
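
The major dialect in question is Common Lisp, where a SPECIAL declaration
(usually via DEFVAR) makes an individual variable dynamically scoped while
everything else stays lexical.  A minimal sketch, with made-up names:

(defvar *depth* 0)          ; DEFVAR proclaims *DEPTH* special, i.e. dynamic

(defun report () (print *depth*))

(defun demo ()
  (let ((*depth* 1)         ; dynamic rebinding, visible inside REPORT
        (x 2))              ; ordinary lexical binding, local to this LET
    (report)                ; prints 1, not 0
    x))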

> Scheme		Closures + static scope

Vot's dis "static scope"? Make that "lexical scope".

> Fortran 	Arrays + fixed allocation
> C		pointers + dynamic allocation

Interesting claim, considering that malloc isn't part of the C language
proper (it lives in the library).  Try

C		HLL as portable assembler, pointers as first-class citizens

> Ada		Generics (?)

Ada doesn't have a unifying concept. That's its fatal flaw.

> FORTH		Threaded code + postfix notation
> Cobol		Formatting of data
> BASIC		the statement as a unit of program
> SNOBOL		Strings

Make that "strings as only data type"...and add

APL		arrays as the only data type

ok@quintus.UUCP (Richard A. O'Keefe) (01/01/90)

In article <1TzGbd#48h84g=eric@snark.uu.net> eric@snark.uu.net (Eric S. Raymond) writes:
>In <1470@mdbs.UUCP> Bill Smith wrote:
>> Language	Fundamental Concept
>> SNOBOL		Strings
>Make that "strings as only data type"...and add

This is not true.  Snobol 4 has integers, floats, patterns, arrays, tables,
user-defined records, and lots of other stuff in addition to strings.
Perhaps
	SNOBOL		user-extensible pattern matching

sbw@naucse.UUCP (Steve Wampler) (01/02/90)

From article <JV.90Jan1162520@mhres.mh.nl>, by jv@mh.nl (Johan Vromans):
> In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
>> C		pointers + dynamic allocation
> Allocation is static.

Hmmm, I need a little lesson in current terminology.  Since C uses
stack-based activation records, isn't the allocation of local variables
dynamic?  I understand what Johan is saying, just wondering how one
distinguishes memory management in C from FORTRAN (OK, the 'old'
FORTRANs).  Wouldn't it be more appropriate to refer to C as having
static sizing with dynamic allocation?  FORTRAN then has static
sizing and allocation, while LISP, Icon, etc. have dynamic sizing and
allocation.

> Add:
> 
> ICON		Goal-oriented evaluation. Dynamic datatypes.
> Pattern matching and "fail" concept from SNOBOL.
 
Just to nitpick, it's *Icon*, not ICON - the name doesn't stand for
anything.  Also, there is no 'pattern-matching' as in SNOBOL, though
it can be simulated fairly easily.  It's replaced with 'string scanning'.
-- 
	Steve Wampler
	{....!arizona!naucse!sbw}

tneff@bfmny0.UU.NET (Tom Neff) (01/02/90)

By all means add:

	RPG		syntax by indentation :-)

and pride requires me to add my own effort:

	DISMAL		self modifying do-case code, fuzzy loops,
			mixed case keywords, 029 keypunch charset,
			and the #goto preprocessor directive

Note - Distribution tapes of full DISMAL are now available.  You
will need to supply twenty blank 6250BPI tapes.  (DISMAL only
occupies 14 tapes, but we need more blanks ourselves.)  Send email
to gwnorth@beautyeh.tor.ca for more information.  Also ask about
Tiny DISMAL, which is distributed on two 128k bubble memory chips.
-- 
"A man came into the the office one day and said he  \|/  Tom Neff
was a sailor.  We cured him of that." - Mark Twain,  -O-  tneff@bfmny0.UU.NET
on his days as a doctor's apprentice in California.  /|\  uunet!bfmny0!tneff

jv@mh.nl (Johan Vromans) (01/02/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
> C		pointers + dynamic allocation
Allocation is static.

> SNOBOL		Strings
Pattern matching + "fail" concept. Snobol4 has lots of datatypes.
Datatypes are dynamic (variables are not fixed to a specific type).
No distinction between programs and data ("EVAL" and "CODE"
functions). This property is shared with Lisp.
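
For example, in Lisp a piece of program is just a list that can be
built, run, and even modified like any other datum.  A toy sketch
(*FORM* is a made-up name):

(defvar *form* (list '+ 1 2))  ; build a program fragment as ordinary data
(eval *form*)                  ; => 3
(setf (third *form*) 40)       ; edit the "program" like any other list
(eval *form*)                  ; => 41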

Add:

ICON		Goal-oriented evaluation. Dynamic datatypes.
Pattern matching and "fail" concept from SNOBOL.


Please post a summary once in a while; this will be an interesting topic.

Johan
--
Johan Vromans				       jv@mh.nl via internet backbones
Multihouse Automatisering bv		       uucp: ..!{uunet,hp4nl}!mh.nl!jv
Doesburgweg 7, 2803 PL Gouda, The Netherlands  phone/fax: +31 1820 62944/62500
------------------------ "Arms are made for hugging" -------------------------

peter@ficc.uu.net (Peter da Silva) (01/02/90)

> 	RPG		syntax by indentation :-)

> and pride requires me to add my own effort:

> 	DISMAL		self modifying do-case code, fuzzy loops,

OK, I'll bite:

	GROOVY		unreal data types, cool statements, memory
			allocation via "I-need-my-own-space-man" construct.

An examination of the programming language 'groovy'.

This language is freed from the rigid and authoritarian requirements of
conventional languages. It has no control structures, as such, because
these imply an uncool imposition of authority. Instead of a 'go to' concept,
the language knows where you're coming from at all times. Similarly, there is
no such thing as an incorrect statement, just 'cool' and 'uncool'.

The main data type is 'unreal'. Variables of type unreal can be allocated
at any time with the "I-need-my-own-space-man" construct. This construct
is always cool, even if used in an uncool context. Unreal variables may be
'in touch' or 'far out'. To access a far out variable, you have to dig it.
You can't dig anything that's not unreal... this is an uncool construct.

There are some restrictions.  Groovy programs that are too long can elicit
messages such as 'Are you trying to lay a trip on me' or 'This is really
getting me down, man'.  These errors are not fatal; merely type 'Hey, it's
cool man, take your time' on the console and processing will (eventually)
proceed.


-- 
`-_-' Peter da Silva. +1 713 274 5180. <peter@ficc.uu.net>.
 'U`  Also <peter@ficc.lonestar.org> or <peter@sugar.lonestar.org>.
"It was just dumb luck that Unix managed to break through the Stupidity Barrier
and become popular in spite of its inherent elegance." -- gavin@krypton.sgi.com

johnl@esegue.segue.boston.ma.us (John R. Levine) (01/02/90)

In article <15050@bfmny0.UU.NET> tneff@bfmny0.UU.NET (Tom Neff) writes:
>	RPG		syntax by indentation :-)

Actually, it's syntax by card column, but RPG does what it does, which is
to produce straightforward reports from fixed-form input, as well as anything
else does.  (Later versions added all sorts of junk for indexed files and
complicated calculations on the way from input to output, but any language
gets baroque in its later days.  Just look at Fortran 8X.)

Personally I prefer awk, which is very much the Unix equivalent of RPG, but
for keeping inventory records on a 4K byte machine, you could have done
far worse.
-- 
John R. Levine, Segue Software, POB 349, Cambridge MA 02238, +1 617 864 9650
johnl@esegue.segue.boston.ma.us, {ima|lotus|spdcc}!esegue!johnl
"Now, we are all jelly doughnuts."

roelof@idca.tds.PHILIPS.nl (R. Vuurboom) (01/02/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
>
>The idea is "fundamental concept."  The fundamental concept of
>a language is the data structure, control structure or style issue
>that distinguishes the language and must be understood in detail before 
>proficient use of the programming language can begin.
>
>I think one of the weaknesses of this idea is that many languages have
>more than one fundamental concept, and thus argument can begin over what
>the true fundamental concept is.  (In other words, the idea is ill-defined
>for some languages.)
>

I'm wondering if you're sufficiently differentiating between the ideas 
of "primitive concept" and "fundamental concept".

A concept can be called primitive if it is generally not profitable to 
break it up into more primitive parts in that particular world of discourse.

In C ints and doubles can be considered primitives. In Lisp, lists would
be considered by many to be a primitive concept.

Fundamental concepts (imho) are those concepts that provide guidelines or
viewpoints in the associated world of discourse.  They provide the methods
and rationale for constructing more complex concepts and for their
corresponding abstraction and interaction. 

In C, data abstraction (via typedef, for example) and procedure abstraction
(function definition) are to me the fundamental concepts, although 
pointers take a good second place (pointing to an object is a strategy
of object naming). 

A (the?) fundamental _implementation_ concept of C (but also of Algol and
Pascal) is that these languages have a stack-based run-time environment. 

sean@aipna.ed.ac.uk (Sean Matthews) (01/03/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
>The idea is "fundamental concept."  The fundamental concept of
>a language is the data structure, control structure or style issue
>that distinguishes the language and must be understood in detail before 
>proficient use of the programming language can begin.
>
[List that to me seems far from fundamental]

The notion of what is a fundamental concept of a programming language
is interesting, but it seems to me that what has been attributed to a
programming language as `fundamental' here is wrong, being far too
concerned with superficialities.

Take first some of the sorts of models that we have for computers:

1. the Von Neumann stored program/ processor machine.

2. the lambda calculus.

2a. combinatory logic

3. communicating processes of various sorts.

4. proof search (e.g. sequent calculus)

and consider them in turn.

The Von Neumann machine, which is a sort of sophisticated Turing
machine, comes to us in all sorts of forms, and has had a decisive
influence (necessarily, probably) on the way computers have been built
up to now, and (unfortunately, definitely) on the way computer
scientists have been educated up to now.

It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
C, PL/1, Simula, Pascal, Modula{2}, BCPL, Ada, Mesa, BASIC etc.

And these languages have had such an influence on computer programming
that most books on programming language design deal only with what to
me seem the minor differences between them.  The similarities between
them all are such that to be able to program in one is to be able to
program in all of them.  One simply decides what is needed for a
particular task.  C is a relatively low level language that is suited
to certain types of system programming, Fortran is good for bashing
numbers, but they are all essentially the same.

One could make a secondary distinction between the languages that are
stack based (e.g. C) and those that are not (e.g. Cobol); the
former would then have a closer affinity with functional programming
languages (see below).  Or one could distinguish between those that
have some form of dynamic storage (e.g. Pascal) and those that do not
(e.g. Fortran).

OK, so I exaggerate a bit: a typical COBOL or BASIC programmer is going
to have more difficulty coming to terms with Algol-68 or C than the
other way around, but even an Algol-68 programmer is parochial---he
still thinks in terms of a load and store architecture with state and
sequential operations and procedures.

I think that this model has no theoretical virtue, all it offers is
real world efficiency.  In evidence, I would say that it shows in the
`formal' descriptions of algorithms that I see in computer text books,
which amount to writing out the algorithm in pseudo PL/1, and which
make me want to fling the book against a wall.  These are not what I
would call formal specifications, since they hopelessly confuse
implementation with design.  This does not condemn the design, just
the way programmers identify it with what computers are, instead
of recognising that it is a subset (and an expressively weak subset).

The second model that I have listed is the lambda calculus, which is a
much nicer theoretical (and practical) model than load and store.  I
have written real programs in what amounted to pure lambda calculus,
and I would rather use it than Algol-68 for most things (but that is
personal prejudice).  What languages use it?  Simple functional
programming languages, including some forms of Lisp, such as Lispkit
Lisp (is that right? if Peter Henderson is on the net he can correct
me).  Pure Lisp was a partial stab at it, but it has dynamic scoping,
which makes a mockery of any pretensions it has to pure functional
language-hood.  One of the interesting things about Lisp is how
entrenched the load and store mindset was, even in the early sixties:
the people who developed it immediately started to shift it from a
lambda calculus foundation, on which it could have settled
properly after a slightly unsteady start, to being more like
Fortran.  And to head off any criticism, yes, I do realise that
something had to be done to Lisp to make it practical on the machines
of the time, but did they have to be *quite* so violent?

One interesting side effect of adding imperative structures (a.k.a.
assignment) to Lisp while trying to fix the problem of dynamic
scoping, was to produce the idea of closures, which appears in Scheme,
and can be seen in retrospect in Smalltalk (which you could think of
as going in the other direction, frantically chasing *some* of the
virtues of functional programming systems from an Algol derived base).
Of course the actual motivation behind Smalltalk was very different,
but it gives a good place to file the ideas that underlie it; it would
be a mistake to think that objects are an entirely novel concept.
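
For concreteness, here is the idea in a few lines of Common Lisp
(MAKE-COUNTER is a made-up name): the returned function carries its
captured lexical binding with it, giving it private, persistent state.

(defun make-counter ()
  (let ((n 0))                  ; lexical binding captured by the closure
    #'(lambda () (incf n))))    ; each call sees and updates the same N

(defvar *c* (make-counter))
(funcall *c*)             ; => 1
(funcall *c*)             ; => 2
(funcall (make-counter))  ; => 1, a fresh N private to the new closure

Under dynamic scoping this trick simply does not work, which is exactly
where the idea came from.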

Further out, there is the typed lambda calculus, which allows much
stronger and more flexible typing systems than what passes for them in
Algol-style languages.  (and then there are `type theories' which are
about as strongly typed as you can get and sacrifice nothing in the
process except bugs, and produce `programming languages' such as
NuPRL, but that is a discussion for another time).  You get this in
programming languages such as ML.

In fact, ML is an interesting case, since it started out with pure
typed lambda calculus and then added as much imperative structure as
was needed (it turns out to be very little), and got a programming
language that, for complex tasks, need not be much less efficient
than C.

I have also listed a section 2a, combinatory logic, which is where
we could put languages like FP, and even APL.  FP is clearly
combinatory, while I would argue that the way that APL works is much
more influenced by the combinatory style that it encourages than by
the fact that it works with arrays (and I know that there is
assignment in APL, but in practice people try to avoid it and write
one-liners instead, since that is the more powerful and natural way
to think in APL; the pity is that it has such a limited set of
combinators).

So what are the advantages that Lambda calculus gives us?  Higher
order functions, powerful typing systems, no side effects (except when
we want them), compact code, more abstraction.  The elimination of the
Von Neumann bottleneck, both in the programmer, and in the machine; we
no longer have to think in terms of one thing at a time.

What about paradigm three?  Well here there is the notion of Hoare's
CSP (yes and Milner's CCS, but let us not complicate things) which is
best seen in occam (no capitals), which is a load store language that
has been heavily modified to remove the Von Neumann bottleneck
mentioned above.  In fact it is barely still a load store language,
since so much of its character comes from data flowing actively around
the machine, going and getting itself processed instead of waiting in
the queue.  A much less important implementation is in Ada, where the
tasking system is similar, but it is not obvious enough or well enough
integrated to affect the mindset of the programmer, being bolted onto
the side instead.

Paradigm four is maybe the least obvious to the typical programmer.
Prolog is the most obvious implementation of this sort of thing.  I am
not sure what we learn from it (I think it---i.e. Prolog, and more
generally, logic programming---is overrated as a programming
technique) but it does at least give some good ideas about what can be
done with pattern matching and non-determinism, and it does give a new
set of intellectual tools with which to approach any problem.

Sorry if the above is a little scrappy, but it was typed in one go and
it is not going to be edited.

Anyway I think I got my point across.  Of course in practice, none of
these categorisations is complete, if only because most of them are
`infected' with the idea of state from the Von Neumann machine---the
only ones that are not so infected impress the theoreticians but are
not much use in practice.

The important thing is to realise that the real differences lie not in
the sorts of data that programming languages deal with, but in the way
that they allow you to approach what you are doing.

Think in terms of natural languages: (unfortunately) most programmers
are in the position of linguists who have studied Italian, and Spanish,
and maybe a little German, and who generalise cheerfully from that
knowledge while having no idea that there are languages such as
Chinese or Arabic.  Think of the interesting ideas they would get from
exposure to all the products of those cultures; how their ideas
of what a narrative is, or what a poem is, would be enlarged.

Sean

jv@mh.nl (Johan Vromans) (01/03/90)

Add:

PERL		Dynamic datatypes and allocation. Pattern matching
                with regexps. Data formatting. Access to system calls.
                Weird syntax. Lisp-like scope rules. No distinction
                between program and data ("EVAL" and "DO" functions).
		PERL = Practical Extraction and Report Language.

Modula-2	Separate module definition and implementation. Heavily
                block-structured. Very strong typing. Combines the
                benefits of Pascal with the very few good features of C.


Please post a summary once in a while; this will be an interesting topic.

Johan
--
Johan Vromans				       jv@mh.nl via internet backbones
Multihouse Automatisering bv		       uucp: ..!{uunet,hp4nl}!mh.nl!jv
Doesburgweg 7, 2803 PL Gouda, The Netherlands  phone/fax: +31 1820 62944/62500
------------------------ "Arms are made for hugging" -------------------------

snorri@rhi.hi.is (Snorri Agnarsson) (01/03/90)

I would like to draw a more precise distinction on memory management
issues: LISP, Icon, SNOBOL, etc. not only have dynamic allocation of memory
(which you also have in C and Pascal) but, more importantly, have
dynamic DEallocation of memory.  In my opinion this is a crucial aspect.
-- 
Snorri Agnarsson		|  Internet:	snorri@rhi.hi.is
Taeknigardur, Dunhaga 5		|  UUCP:	..!mcvax!hafro!rhi!snorri
IS-107 Reykjavik, ICELAND

roelof@idca.tds.PHILIPS.nl (R. Vuurboom) (01/03/90)

In article <1782@aipna.ed.ac.uk> sean@aipna.ed.ac.uk (Sean Matthews) writes:

[A number of interesting points, a few of which I would like to look at.] 
|
|Take first some of the sorts of models that we have for computers:
|
|1. the Von Neumann stored program/ processor machine.
|2. the lambda calculus.
|2a. combinatory logic
|3. communicating processes of various sorts.
|4. proof search (e.g. sequent calculus)
|
1 is obviously a computer model, and 3 could be used as a computer model. 
But 2, 2a and 4?  They look a lot more like language models to me.

|
|The Von Neumann machine, which is a sort of sophisticated Turing
|machine, comes to us in all sorts of forms, and has had a decisive
|influence (necessarily, probably) on the way computers have been built
|up to now, and (unfortunately, definitely) on the way computer
|scientists have been educated up to now.
|
Imo the von Neumann machine is a sort of sophisticated Turing machine
in the same way an integer variable is a sort of sophisticated bit
box.  You can emulate a computer with a Turing machine.  You can emulate
an integer with a bit box.  What you don't do is conceptualize an integer
as a bit box, and you don't (at least I don't) _conceptualize_ a computer
as a Turing machine - the difference in levels of sophistication and 
complexity is just too great (at least for me :-).  As I see it,
a Turing machine is no longer used by computer scientists as a computer
paradigm (except for some admittedly interesting theoretical results).
 
|It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
|C, PL/1, Simula, Pascal, Modula{2}, BCPL, Ada, Mesa, BASIC etc.
|

Agreed:
The von Neumann machine has produced assignment (imperative programming),
and this is a backbone of all the above languages.  A further separation
into stack-based and non-stack-based is possible, as you noted. 

|
|The second model that I have listed is the lambda calculus, which is a
|much nicer theoretical (and practical) model than load and store.

As always, it depends what your practice is as to whether or not it's 
practical...

|In fact, ML is an interesting case, since it started out with pure
|typed lambda calculus and then added as much imperative structure as
|was needed (it turns out to be very little), and got a programming
|language that, for complex tasks, need not be much less efficient
|than C.
|
Maybe this is a little too simplistic: if you're using C to develop
a model which can be naturally implemented in ML, you're probably
right.  Vice versa, you could lose out.
 
|
|So what are the advantages that Lambda calculus gives us?  Higher
|order functions, powerful typing systems, no side effects (except when
|we want them), compact code, more abstraction.  The elimination of the
|Von Neumann bottleneck, both in the programmer, and in the machine; we
|no longer have to think in terms of one thing at a time.
|
And let's not forget the disadvantages: it's next to impossible to
model any number of real world situations with it.

|What about paradigm three?  Well here there is the notion of Hoare's
|CSP (yes and Milner's CCS, but let us not complicate things) which is
|best seen in occam (no capitals), which is a load store language that
|has been heavily modified to remove the Von Neumann bottleneck
|mentioned above.  

Agreed.
|
|Paradigm four is maybe the least obvious to the typical programmer.
|Prolog is the most obvious implementation of this sort of thing.  I am
|not sure what we learn from it (I think it---i.e. Prolog, and more
|generally, logic programming---is overrated as a programming
|technique) but it does at least give some good ideas about what can be
|done with pattern matching and non-determinism, and it does give a new
|set of intellectual tools with which to approach any problem.
|
I would consider pattern matching and non-determinism to be more 
mechanisms than policies (viewpoints, paradigms).  The paradigm is not
pattern matching and non-determinism; the paradigm is descriptive
programming vs procedural programming.  I find Prolog a little
clumsy in its form of expression (syntax), but where relevant it
offers in my view a higher (and better) form of abstraction than
does lambda calculus (functional programming) or procedural programming.
 
|
|the important thing is to realise that the real differences are not of
|the sorts of data that programming languages deal with, but the way
|that they allow you to approach what you are doing.
|
Couldn't agree more.

dave@PRC.Unisys.COM (David Lee Matuszek) (01/04/90)

When I read Bill Smith's note about the "fundamental concept" of a
programming language, I wanted to reply, but was too busy at the time.
Just as well, since Sean Matthews made exactly the points I wanted to,
and did a good job of it, so:  Yeah.  What he said.

When I was in academia I frequently taught the Programming Languages
courses, and have never been satisfied with any of the available
textbooks.  As Matthews points out, most of them never look beyond Von
Neumann machines; those that do usually present summaries of the
languages (missing the important ideas), without any clear idea of
their foundations.

I have often thought that, if I ever write a Programming Languages
book, I would like to organize it in terms of mathematical models:  a
short introduction to the lambda calculus, followed by a chapter on
Lisp; an introduction to resolution with Horn clauses, followed by a
chapter on Prolog; an introduction to string rewriting systems,
followed by a chapter on Snobol IV (OK, I guess these days it would
have to be Icon, but Snobol IV is less contaminated by the Von Neumann
architecture); and so on.  And, of course, a large section on Von
Neumann languages, and all the fine distinctions among those.

My reason for posting (aside from just wanting to second Sean
Matthews) is to ask:  does anyone know if there yet exists a P.L. book
along the lines I have just described?

-- Dave Matuszek (dave@prc.unisys.com)
-- Unisys Corp. / Paoli Research Center / PO Box 517 / Paoli PA  19301
-- Any resemblance between my opinions and those of my employer is improbable.
  << Those who fail to learn from Unix are doomed to repeat it. >>

toma@tekgvs.LABS.TEK.COM (Tom Almy) (01/04/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:

>I think one of the weaknesses of this idea is that many languages have
>more than one fundamental concept, and thus argument can begin over what
>the true fundamental concept is.  (In other words, the idea is ill-defined
>for some languages.)

Are you concerned with the "Fundamental Concept" at the time the language
was new, or in current use? It makes a difference.

>Language	Fundamental Concept

Assembler	labels + mnemonic instructions

>Lisp		Lists + dynamic scope
Scoping rules depend on dialect.  But certainly true for original Lisp.
>Fortran 	Arrays + fixed allocation
Originally "algebraic syntax", since the only alternative was assembler.
>FORTH		Threaded code + postfix notation
Originally true, but threaded code is not used in all implementations.
I feel the key concept is that all tokens execute the same way.

Smalltalk	All data are objects + browser
Pascal		Local functions
Modula-2	Modules + processes
BCPL		lvalue/rvalue concept + single data type (covers integers,
			arrays, functions, labels)
Algol		Block structure + structured programming


Tom Almy
toma@tekgvs.labs.tek.com
Standard Disclaimers Apply

debray@cs.arizona.edu (Saumya K. Debray) (01/04/90)

In article <1782@aipna.ed.ac.uk>, sean@aipna.ed.ac.uk (Sean Matthews) writes:
> Paradigm four is maybe the least obvious to the typical programmer.
> Prolog is the most obvious implementation of this sort of thing.  I am
> not sure what we learn from it ...

Computation as controlled deduction?  Program semantics as (simple)
statements of logic (as opposed to the rather baroque structures one
must construct for the denotational semantics of most languages)?

-- 
Saumya Debray		CS Department, University of Arizona, Tucson

     internet:   debray@cs.arizona.edu
     uucp:       uunet!arizona!debray

new@udel.edu (Darren New) (01/04/90)

> FORTH		Threaded code + postfix notation

Actually, the threaded code and postfix notation are not the fundamental
parts of FORTH, but rather seem to me to be there to support:

  FORTH		user-defined data representations, operations, and literals.

For example, try taking the "string" data type out of a language and then
putting it back in without changing the compiler.  Only FORTH and 
LISP allow this sort of thing, and LISP only clumsily. But doing this
requires good user-control over parsing and code generation, thus the 
threaded code and postfix notation.
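
To make the LISP half of that concrete: a reader macro lets you bolt a
new literal syntax onto the language without touching the compiler.  A
Common Lisp sketch (the #[low high] syntax is invented for the occasion):

;; make ] a delimiter so READ-DELIMITED-LIST can stop on it
(set-macro-character #\] (get-macro-character #\)))

;; #[low high] now reads as the list of integers from low to high
(set-dispatch-macro-character #\# #\[
  #'(lambda (stream subchar arg)
      (declare (ignore subchar arg))
      (let ((bounds (read-delimited-list #\] stream t)))
        (loop for i from (first bounds) to (second bounds)
              collect i))))

'#[1 5]   ; reads as (1 2 3 4 5)

Clumsy, as I said, but no compiler changes needed. -- Darren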

gudeman@cs.arizona.edu (David Gudeman) (01/04/90)

In article  <1782@aipna.ed.ac.uk> sean@aipna.ed.ac.uk (Sean Matthews) writes:
>The Von Neumann machine, which is a sort of sophisticated Turing
>machine,

Hardly.  They are related in the same sense that the lambda calculus
is related to combinatory logic, or in the same way that grammars are
related to predicate logic.

>...It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
>C, PL/1, Simula, Pascal, Modula{2}, BCPL, Ada, Mesa, BASIC etc.
>
>And these languages have had such an influence on computer programming
>that most books on programming language design deal only with what to
>me seem the minor differences between them.

True enough...

>I think that this model has no theoretical virtue, all it offers is
>real world efficiency.

I disagree with this.  Algorithms were written long before there were
computers to execute them.  Part of the problem with this view is that
you are not distinguishing between the trivial models of algorithmic
computing (Turing machines and Von Neumann machines) and the
fundamental concept of an algorithm, which I define as: "a description
of a problem solution by systematic changes in a computing device,
based on rules of computation and on the current state of the device,
where the state of the device at termination encodes the problem
solution."  This view of computation is not only theoretically useful,
it is of great practical importance in finding solutions to many
problems that are conveniently solved by such a method.
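
To make that definition concrete, here is Euclid's algorithm written
deliberately in the state-change style (Lisp syntax only because it is
this thread's lingua franca; GCD* is a made-up name, so as not to
shadow the built-in GCD):

(defun gcd* (a b)
  ;; the pair (A B) is the state of the "computing device"; each
  ;; iteration changes it by a fixed rule, and the state at
  ;; termination (B = 0) encodes the solution in A.
  (loop until (zerop b)
        do (psetq a b
                  b (mod a b)))
  a)

(gcd* 48 18)   ; => 6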

In fact this method of computation is so convenient and intuitive
that it is often used as a way of understanding programs written in
other models of computation.  For example, the lambda calculus is
often treated as a rewrite system, and I claim that rewrite systems
are essentially algorithmic in nature.  The same may be said of
grammars, and even predicate logic to a certain extent.

>So what are the advantages that Lambda calculus gives us?  Higher
>order functions,

Algorithmic systems can have higher-order algorithms.

>powerful typing systems,

completely unrelated to the functional/algorithmic difference.

>no side effects (except when we want them)

No side effects _ever_, even when we want them.  Unless you are giving
up the purity of the paradigm.

>compact code

more related to the data abstractions available than to the computational model.

> more abstraction.

depends on your definition of abstraction.  If you define "more
abstract" as "less algorithmic", then this is true.

>The elimination of the Von Neumann bottleneck

This presumes that the Von Neumann machine is the basis of all
algorithmic languages.  In particular it presumes that algorithmic
methods are limited to computation on integers, and that structured
objects must be handled one element at a time.  This is far from true,
and many of the languages you listed as Von Neumann languages present
facilities for treating aggregates at any desired level.  What they
frequently don't have is _primitives_ for treating aggregates as a
whole, but these can be (and usually are) supplied in libraries.

A very large percentage of the criticisms of "conventional languages"
made by proponents of other systems are really criticisms of the data
types and primitive operations available in the languages.  For
example, one often sees FP proponents claim that the array data
structure is too limited because of its word-at-a-time operations.
But this criticism (which I agree with) doesn't imply that we should
switch to a functional language with sequences, it implies that we
should introduce sequences into the languages we prefer.
-- 
					David Gudeman
Department of Computer Science
The University of Arizona        gudeman@cs.arizona.edu
Tucson, AZ 85721                 noao!arizona!gudeman

jschon@infmx.UUCP (John Schonholtz) (01/04/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
>
>Language	Fundamental Concept
>
>Ada		Generics (?)

Not quite!  As anyone who has worked extensively with the language
knows...

Language	Fundamental Concept

Ada		Information Hiding

In fact, the DOD did such a good job here that, when examining
most Ada programs, it is impossible to find out anything.

;-)

-- 
*	John Schonholtz		* It's the end of the world as we know it
*   Informix Software, Inc.     * And I feel fine
*        415/926-6813		* pyramid!infmx!jschon      

ljdickey@water.waterloo.edu (L.J.Dickey) (01/09/90)

In article <1782@aipna.ed.ac.uk> sean@aipna.ed.ac.uk (Sean Matthews) writes:

>I have also listed a section 2a, combinatory logic, which is where
>we could put languages like FP, and even APL.  FP is clearly
>combinatory, while I would argue that the way that APL works is much
>more influenced by the combinatory style that it encourages than by
>the fact that it works with arrays (and I know that there is
>assignment in APL, but in practice people try to avoid it and write
>one-liners instead, since that is the more powerful and natural way
>to think in APL; the pity is that it has such a limited set of
>combinators).

Readers are encouraged to track the progress of SAX (Sharp APL for UNIX),
which has introduced some new combinators.  Their vocabulary uses words
like "noun", "verb", "adverb", and "conjunction".  An adverb corresponds
to a mathematical operator because it acts on a function ("verb") and 
returns a function as a result.  Their new adverbs and conjunctions
are (for me) the most interesting combinators.  Some APL users are as
concerned about the proliferation of parentheses as, apparently,
some combinatory logicians were, and so find practical use for the
commute adverb, for instance.

A recent research paper on this topic by some involved one way or another
with SAX, appears in APL Quote Quad, Volume 19, Number 4, August 1989,
on page 197.

-- 
    L. J. Dickey, Faculty of Mathematics, University of Waterloo.
	ljdickey@water.UWaterloo.ca	ljdickey@water.BITNET
	ljdickey@water.UUCP		..!uunet!watmath!water!ljdickey
	ljdickey@water.waterloo.edu	

alan@oz.nm.paradyne.com (Alan Lovejoy) (01/09/90)

What the world needs is a programming language whose "fundamental concept"
("raison d'etre") is abstraction. Period.

"Give me a place to stand on, and with this lever..."

Enough said.

____"Congress shall have the power to prohibit speech offensive to Congress"____
Alan Lovejoy; alan@pdn; 813-530-2211; AT&T Paradyne: 8550 Ulmerton, Largo, FL.
Disclaimer: I do not speak for AT&T Paradyne.  They do not speak for me. 
Mottos:  << Many are cold, but few are frozen. >>     << Frigido, ergo sum. >>

sakkinen@tukki.jyu.fi (Markku Sakkinen) (01/10/90)

In article <6917@pdn.paradyne.com> alan@oz.paradyne.com (Alan Lovejoy) writes:
-What the world needs is a programming language whose "fundamental concept"
-("raison d'etre") is abstraction. Period.
-
-"Give me a place to stand on, and with this lever..."
-
-Enough said.

CLU ?

Markku Sakkinen
Department of Computer Science
University of Jyvaskyla (a's with umlauts)
Seminaarinkatu 15
SF-40100 Jyvaskyla (umlauts again)
Finland

nick@lfcs.ed.ac.uk (Nick Rothwell) (01/10/90)

In article <2886@water.waterloo.edu>, ljdickey@water (L.J.Dickey) writes:
>Readers are encouraged to track the progress of SAX (Sharp APL for UNIX),
>which has introduced some new combinators.  Their vocabulary uses words
>like "noun", "verb", "adverb", and "conjunction".  An adverb corresponds
>to a mathematical operator because it acts on a function ("verb") and 
>returns a function as a result.

Why are "operators" different to functions?

What do you do about full higher-order facilities (e.g. higher-order
functions returning higher-order functions)? "adadadadadverbs"?

>    L. J. Dickey, Faculty of Mathematics, University of Waterloo.

		Nick.
--
Nick Rothwell,	Laboratory for Foundations of Computer Science, Edinburgh.
		nick@lfcs.ed.ac.uk    <Atlantic Ocean>!mcvax!ukc!lfcs!nick
~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~
  "...all these moments... will be lost in time... like tears in rain."

amull@Morgan.COM (Andrew P. Mullhaupt) (01/11/90)

In article <1532@castle.ed.ac.uk>, nick@lfcs.ed.ac.uk (Nick Rothwell) writes:
> 
> Why are "operators" different to functions?
The situation is much like that in APL2, to wit:
they can have functions as arguments.  The result of an operator is
(can be) a function suitable for use as an argument to another
operator.  The exceptions in APL are "jot", which is not any kind of
thing (it only exists as part of the "jot-dot" operator), and slash,
which is the symbol for both "compress" (a function) and "reduce"
(an operator).  (A mistake graven in stone by now...)


Functions, therefore, are not 'first class data' since the functions
which operate on them are not functions. (This is a lot like Lisp).

> 
> What do you do about full higher-order facilities (e.g. higher-order
> functions returning higher-order functions)? "adadadadadverbs"?
> 

I think here the APL2 answer is that you don't.  Operators are not
(so far as I can tell) legitimate arguments to other operators.  The
reason for this is not unlike what you have guessed - you cannot
express the definition of an "operator on operators" in the present
syntax.  English is not so limited; things which modify adverbs
are very frequently adjectives.

Later,
Andrew Mullhaupt

barmar@think.com (Barry Margolin) (01/12/90)

In article <666@s5.Morgan.COM> amull@Morgan.COM (Andrew P. Mullhaupt) writes:
>In article <1532@castle.ed.ac.uk>, nick@lfcs.ed.ac.uk (Nick Rothwell) writes:
>> Why are "operators" different to functions?
>They can have functions as arguments.
...
>Functions, therefore, are not 'first class data' since the functions
>which operate on them are not functions. (This is a lot like Lisp).

Lisp doesn't have the distinction between functions and operators.  In
Lisp, functions can take functions as arguments and return them as values.
For instance, in Common Lisp one can write:

(defun complement (function)
  #'(lambda (&rest args)
      (not (apply function args))))

(complement <function>) returns a function that returns true when the
original function returns false, and vice versa (this function will be part
of ANSI Common Lisp).  One could then write:

(setf (symbol-function 'nequal) (complement #'equal))

or

(funcall (complement #'member) 'foo *some-list*)

And here's a simple definition of APL's / operator (in its "reduce"
sense):

(defun compress (function)
  ;; return a function that reduces its arguments with FUNCTION
  #'(lambda (&rest args)
      (reduce function args)))

;;; If Lisp didn't already have an n-argument +, one could write:
(defun sum (&rest values)
  (apply (compress #'+) values))
--
Barry Margolin, Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

matt@oddjob.uchicago.edu (Matt Crawford) (01/13/90)

Keeping it to a few words, as Bill did, how about

Icon		string pattern matching and backtracking
Prolog		relational pattern matching and backtracking

?

________________________________________________________
Matt Crawford	     		matt@oddjob.uchicago.edu

franka@mentor.com (Frank A. Adrian) (01/13/90)

In article <32824@news.Think.COM> barmar@think.com (Barry Margolin) writes:
>In article <666@s5.Morgan.COM> amull@Morgan.COM (Andrew P. Mullhaupt) writes:
>>In article <1532@castle.ed.ac.uk>, nick@lfcs.ed.ac.uk (Nick Rothwell) writes:
>>> Why are "operators" different to functions?
>>They can have functions as arguments.
>...
>>Functions, therefore, are not 'first class data' since the functions
>>which operate on them are not functions. (This is a lot like Lisp).
>
>Lisp doesn't have the distinction between functions and operators.  In
>Lisp, functions can take functions as arguments and return them as values.
>For instance, in Common Lisp one can write:
>
>(defun complement (function)
>  #'(lambda (&rest args)
>      (not (apply function args))))
>
>(complement <function>) returns a function that returns true when the
>original function returns false, and vice versa (this function will be part
>of ANSI Common Lisp).  One could then write:
>
>(setf (symbol-function 'nequal) (complement #'equal))
>
>or
>
>(funcall (complement #'member) 'foo *some-list*)
>

Not to dispute Barry's assertion (as he states, one CAN pass and return
functions in CL), but one still has to use the clunky (function ...) special
form and usually one of the (funcall ...) or (apply ...) functions.  This
does make handling functional slots in function calls non-symmetrical with
respect to data slots.  I much prefer Scheme's symmetric handling of all
slots in a function call.
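
The asymmetry in a few lines (TWICE is a made-up name):

;; CL: a function received as data must be invoked through FUNCALL
(defun twice (f x)
  (funcall f (funcall f x)))

(twice #'1+ 5)   ; => 7

;; Scheme reads symmetrically: (define (twice f x) (f (f x)))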

By the way, this seems to arise in CL due to the use of a different value cell
for the functional binding of an identifier.  Can somebody tell me where this
differentiation arose and why?
-- 

Frank A. Adrian
Mentor Graphics, Inc.
franka@mntgfx.com

johnl@esegue.segue.boston.ma.us (John R. Levine) (01/15/90)

In article <1990Jan13.152226.4722@mentor.com> you write:
>By the way, this seems to arise in CL due to the use of a different value
>cell for the functional binding of an identifier.  Can somebody tell me where
>this differentiation arose and why?

My Lisp 1.5 manual shows that this dates way back to the original version.
There weren't value or function cells, just property lists for atoms and the
A-list for lambda bindings.  Function values were stored as the EXPR
property for lambda expressions, the SUBR property for compiled code, or on
the A-list for LABEL bindings.  Values were stored as the APVAL property for
permanently bound values or else found on the A-list.
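
In today's notation, the property-list half of that lookup amounted to
something like this (FOO is a made-up atom, and GET is standing in
loosely for the Lisp 1.5 property search):

(get 'foo 'expr)    ; interpreted definition (a lambda expression), if any
(get 'foo 'subr)    ; compiled definition, if any
(get 'foo 'apval)   ; permanently bound value, if any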

I suspect that it seemed more convenient not to have to worry about code and
data namespace collision, rather than any deep opinion about their relative
merits.  Perhaps someone who was around in the early 60's can recall what
the thinking was.
-- 
John R. Levine, Segue Software, POB 349, Cambridge MA 02238, +1 617 864 9650
johnl@esegue.segue.boston.ma.us, {ima|lotus|spdcc}!esegue!johnl
"Now, we are all jelly doughnuts."

aarons@syma.sussex.ac.uk (Aaron Sloman) (01/15/90)

wsmith@mdbs.UUCP (Bill Smith) writes:

> Date: 29 Dec 89 19:40:36 GMT
> Organization: MDBS Inc., Lafayette, IN
>
> I have an idea regarding programming language design that I would
> like to toss out in order to see how accurately it describes the
> high level thinking that goes into a new programming language.
>
> The idea is "fundamental concept."  The fundamental concept of
> a language is the data structure, control structure or style issue
> that distinguishes the language and must be understood in detail before
> proficient use of the programming language can begin.
>
POP-11
        Block-structure with multi-bracket syntax, collection of
        abnormal exit procedures (chain, exitfrom, exitto), use of
        "open" stack (Like Forth), "dynamic local expressions",
        lexical and dynamic binding, most kinds of numbers, records,
        vectors, arrays, strings, "words" (in a dictionary),
        procedures as data objects, stream processing via repeaters
        and consumers, list processing including pattern matcher,
        dynamic lists (limited lazy evaluation), lightweight
        processes, properties, class-procedures, syntax extendable
        by code-planting as well as macro expansion, autoloadable
        libraries, OOP subsystem, incremental compiler, screen
        editor as a sub-routine.

        (AIM: an easy language for beginners that "grows" as you
        learn, and remains useful for advanced R&D).

Actually - that list applies only to Poplog Pop-11. Alphapop (Byte
May 1988) and earlier dialects (POP2, POP-10, WPOP) contain a
subset.

All dialects of POP have at least the open stack, dynamic binding
of variables, garbage collector, incremental compilation, records,
arrays, vectors, strings, words, list processing, partial
application.

Also, I have mixed up language features with other features. But
from the point of view of a user there's no sharp boundary.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QH, England
    EMAIL   aarons@cogs.sussex.ac.uk
            aarons%uk.ac.sussex.cogs@nsfnet-relay.ac.uk
            aarons%uk.ac.sussex.cogs%nsfnet-relay.ac.uk@relay.cs.net
    BITNET: aarons%uk.ac.sussex.cogs@uk.ac
    UUCP:     ...mcvax!ukc!cogs!aarons
            or aarons@cogs.uucp

tchrist@convex.COM (Tom Christiansen) (01/16/90)

Could someone please summarize the one-liners on this topic?

thanks,

--tom

    Tom Christiansen                       {uunet,uiucdcs,sun}!convex!tchrist 
    Convex Computer Corporation                            tchrist@convex.COM
		 "EMACS belongs in <sys/errno.h>: Editor too big!"