[comp.lang.c] The Fundamental Concept of Programming language X

wsmith@mdbs.UUCP (Bill Smith) (12/30/89)

I have an idea regarding programming language design that I would 
like to toss out in order to see how accurately it describes the
high level thinking that goes into a new programming language.

The idea is "fundamental concept."  The fundamental concept of
a language is the data structure, control structure or style issue
that distinguishes the language and must be understood in detail before 
proficient use of the programming language can begin.

I think one of the weaknesses of this idea is that many languages have
more than one fundamental concept, and thus arguments can begin over what
the true fundamental concept is.  (In other words, the idea is ill-defined
for some languages.)

Here is a table of my list of fundamental concepts (The whole point of 
this exercise is to abstract away the syntax of the language and get to 
what conceptual background is necessary to understand the language.   
Once this is done, the language may be summarized in one or two sentences.  
Once that is done, one could write original summaries for a language 
and then work backwards until a possibly useful programming language is 
created.) Feel free to add or subtract from this list.   The idea is to 
generate discussion.

Language	Fundamental Concept

Lisp		Lists + dynamic scope
Scheme		Closures + static scope
Fortran 	Arrays + fixed allocation
C		pointers + dynamic allocation
Ada		Generics (?)
FORTH		Threaded code + postfix notation
Cobol		Formatting of data
BASIC		the statement as a unit of program
SNOBOL		Strings

Bill Smith
pur-ee!mdbs!wsmith

(Please let me know if this made it out.  I haven't seen anything in 
comp.lang.misc for almost a month and can't tell if it is because it has
been inactive or because our feed to the group has been cut off
by the nearby institution.)
(My opinions only)

eric@snark.uu.net (Eric S. Raymond) (01/01/90)

In <1470@mdbs.UUCP> Bill Smith wrote:
> Language	Fundamental Concept
> 
> Lisp		Lists + dynamic scope

Nah. Not all LISPs are dynamically scoped. One major dialect allows you to
choose dynamic or lexical scoping on a per-variable basis! Make that

Lisp		Lists as fundamental type + code as data + dynamic allocation

> Scheme		Closures + static scope

Vot's dis "static scope"? Make that "lexical scope".

> Fortran 	Arrays + fixed allocation
> C		pointers + dynamic allocation

Interesting claim, considering that malloc isn't part of C. Try

C		HLL as portable assembler, pointers as first-class citizens
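
By "first-class citizens" I mean that pointers are ordinary values:
they can be stored, passed to functions, and computed with.  A minimal
sketch (the names are invented for illustration):

    #include <stdio.h>

    /* swap two ints through pointers passed as ordinary values */
    static void swap(int *a, int *b)
    {
        int t = *a; *a = *b; *b = t;
    }

    int main(void)
    {
        int v[2] = { 1, 2 };
        int *p = v;        /* a pointer stored in a variable     */
        swap(p, p + 1);    /* pointer arithmetic: p + 1 is &v[1] */
        printf("%d %d\n", v[0], v[1]);    /* prints "2 1"        */
        return 0;
    }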

> Ada		Generics (?)

Ada doesn't have a unifying concept. That's its fatal flaw.

> FORTH		Threaded code + postfix notation
> Cobol		Formatting of data
> BASIC		the statement as a unit of program
> SNOBOL		Strings

Make that "strings as only data type"...and add

APL		arrays as the only data type

sbw@naucse.UUCP (Steve Wampler) (01/02/90)

From article <JV.90Jan1162520@mhres.mh.nl>, by jv@mh.nl (Johan Vromans):
> In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
>> C		pointers + dynamic allocation
> Allocation is static.

Hmmm, I need a little lesson in current terminology.  Since C uses
stack-based activation records, isn't the allocation of local variables
dynamic?  I understand what Johan is saying, just wondering how one
distinguishes memory management in C from FORTRAN (Ok, the 'old'
FORTRANs)?  Wouldn't it be more appropriate to refer to C as having
static sizing with dynamic allocation?  FORTRAN then has static
sizing and allocation, while LISP, Icon, etc. have dynamic sizing and
allocation.
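
A small C sketch of that distinction, assuming the usual stack-based
implementation:

    #include <stdlib.h>

    void f(int n)
    {
        int local[10];   /* size fixed at compile time; storage   */
                         /* allocated on the stack at each call   */
        int *heap = malloc(n * sizeof *heap);   /* amount of heap */
                         /* storage chosen at run time            */
        if (heap != NULL) {
            heap[0] = local[0] = n;
            free(heap);  /* heap storage must be freed explicitly */
        }
    }                    /* stack storage vanishes on return      */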

> Add:
> 
> ICON		Goal-oriented evaluation. Dynamic datatypes.
> Pattern matching and "fail" concept from SNOBOL.
 
Just to nitpick, it's *Icon*, not ICON - the name doesn't stand for
anything.  Also, there is no 'pattern-matching' as in SNOBOL, though
it can be simulated fairly easily.  It's replaced with 'string scanning'.
-- 
	Steve Wampler
	{....!arizona!naucse!sbw}

tneff@bfmny0.UU.NET (Tom Neff) (01/02/90)

By all means add:

	RPG		syntax by indentation :-)

and pride requires me to add my own effort:

	DISMAL		self modifying do-case code, fuzzy loops,
			mixed case keywords, 029 keypunch charset,
			and the #goto preprocessor directive

Note - Distribution tapes of full DISMAL are now available.  You
will need to supply twenty blank 6250BPI tapes.  (DISMAL only
occupies 14 tapes, but we need more blanks ourselves.)  Send email
to gwnorth@beautyeh.tor.ca for more information.  Also ask about
Tiny DISMAL, which is distributed on two 128k bubble memory chips.
-- 
"A man came into the the office one day and said he  \|/  Tom Neff
was a sailor.  We cured him of that." - Mark Twain,  -O-  tneff@bfmny0.UU.NET
on his days as a doctor's apprentice in California.  /|\  uunet!bfmny0!tneff

jv@mh.nl (Johan Vromans) (01/02/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
> C		pointers + dynamic allocation
Allocation is static.

> SNOBOL		Strings
Pattern matching + "fail" concept. Snobol4 has lots of datatypes.
Datatypes are dynamic (variables are not fixed to a specific type).
No distinction between programs and data ("EVAL" and "CODE"
functions). This property is shared with lisp.

Add:

ICON		Goal-oriented evaluation. Dynamic datatypes.
Pattern matching and "fail" concept from SNOBOL.


Please resume once in a while; this will be an interesting topic.

Johan
--
Johan Vromans				       jv@mh.nl via internet backbones
Multihouse Automatisering bv		       uucp: ..!{uunet,hp4nl}!mh.nl!jv
Doesburgweg 7, 2803 PL Gouda, The Netherlands  phone/fax: +31 1820 62944/62500
------------------------ "Arms are made for hugging" -------------------------

sean@aipna.ed.ac.uk (Sean Matthews) (01/03/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
>The idea is "fundamental concept."  The fundamental concept of
>a language is the data structure, control structure or style issue
>that distinguishes the language and must be understood in detail before 
>proficient use of the programming language can begin.
>
[List that to me seems far from fundamental]

The notion of what is a fundamental concept of a programming language
is interesting, but it seems to me that what has been attributed to
each programming language as `fundamental' here is wrong, being far too
concerned with superficialities.

Take first some of the sorts of models that we have for computers:

1. the Von Neumann stored program/ processor machine.

2. the lambda calculus.

2a. combinatory logic

3. communicating processes of various sorts.

4. proof search (e.g. sequent calculus)

and consider them in turn.

The Von Neumann machine, which is a sort of sophisticated Turing
machine, comes to us in all sorts of forms, and has had a decisive
influence (necessarily, probably) on the way computers have been built
up to now, and (unfortunately, definitely) on the way computer
scientists have been educated up to now.

It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
C, PL/1, Simula, Pascal, Modula{2}, BCPL, Ada, Mesa, BASIC etc.

And these languages have had such an influence on computer programming
that most books on programming language design deal only with what to
me seem the minor differences between them.  The similarities between
them all are such that to be able to program in one is to be able to
program in all of them.  One simply decides what is needed for a
particular task.  C is a relatively low level language, that is suited
to certain types of system programming, Fortran is good for bashing
numbers, but they are all essentially the same.

One could make a secondary distinction between the languages that are
stack based (e.g. C) , and those that are not (e.g. Cobol), the
former would then have a closer affinity with functional programming
languages (see below).  Or one could distinguish between those that
have some form of dynamic storage (e.g. Pascal) and those that do not
(e.g. Fortran).

Ok, so I exaggerate a bit: a typical COBOL or BASIC programmer is going
to have more difficulty coming to terms with Algol-68 or C than the
other way around, but even an Algol-68 programmer is parochial---he
still thinks in terms of a load and store architecture with state and
sequential operations and procedures.

I think that this model has no theoretical virtue; all it offers is
real world efficiency.  In evidence, I would point to the `formal'
descriptions of algorithms that I see in computer text books,
which amount to writing out the algorithm in pseudo PL/1, and which
make me want to fling the book against a wall.  These are not what I
would call formal specifications, since they hopelessly confuse
implementation with design.  This does not condemn the model itself,
just the way programmers identify it with what computers are, instead
of recognising that it is a subset (and an expressively weak subset).

The second model that I have listed is the lambda calculus, which is a
much nicer theoretical (and practical) model than load and store.  I
have written real programs in what amounted to pure lambda calculus,
and I would rather use it than Algol-68 for most things (but that is
personal prejudice).  What languages use it?  Simple functional
programming languages, including some forms of lisp, such as Lispkit
lisp (is that right? if Peter Henderson is on the net he can correct
me).  Pure Lisp was a partial stab at it, but it has dynamic scoping,
which makes a mockery of any pretensions it has to pure functional
language-hood.  One of the interesting things about lisp is how
entrenched the load and store mindset was, even in the early sixties:
the people who developed it immediately started to shift it from a
lambda calculus foundation, on which it could have settled properly
after a slightly unsteady start, to being more like Fortran.  And to
head off any criticism, yes, I do realise that something had to be
done to Lisp to make it practical on the machines of the time, but did
they have to be *quite* so violent?

One interesting side effect of adding imperative structures (a.k.a.
assignment) to Lisp while trying to fix the problem of dynamic
scoping was to produce the idea of closures, which appears in Scheme,
and can be seen in retrospect in Smalltalk (which you could think of
as going in the other direction, frantically chasing *some* of the
virtues of functional programming systems from an Algol-derived base).
Of course the actual motivation behind Smalltalk was very different,
but it gives a good place to file the ideas that underlie it; it would
be a mistake to think that objects are an entirely novel concept.
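
Operationally, a closure is just code packaged with the environment it
refers to, and you can sketch the idea by hand even in a load and
store language (C here; the names are invented):

    #include <stdio.h>

    /* a "closure" by hand: a function pointer plus captured state */
    struct counter {
        int count;                       /* the captured variable */
        int (*next)(struct counter *);   /* the code              */
    };

    static int next_impl(struct counter *c) { return ++c->count; }

    int main(void)
    {
        struct counter a = { 0, next_impl };
        struct counter b = { 100, next_impl };
        int x = a.next(&a);
        int y = a.next(&a);
        int z = b.next(&b);
        printf("%d %d %d\n", x, y, z);   /* prints "1 2 101":     */
        return 0;                        /* each instance carries */
    }                                    /* its own state         */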

Further out, there is the typed lambda calculus, which allows much
stronger, and more flexible, typing systems than what passes for them
in Algol-style languages.  (And then there are `type theories', which
are about as strongly typed as you can get and sacrifice nothing in
the process except bugs, and produce `programming languages' such as
NuPRL, but that is a discussion for another time.)  You get this in
programming languages such as ML.

In fact, ML is an interesting case, since it started out with pure
typed lambda calculus and then added as much imperative structure as
was needed (it turns out to be very little), and got a programming
language that, for complex tasks, does not need to be much less
efficient than C.

I have also listed a section 2a, combinatory logic, which is where
we could put languages like FP, and even APL.  FP is clearly
combinatory, while I would argue that the way that APL works is much
more influenced by the combinatory style that it encourages than by
the fact that it works with arrays.  (I know that there is
assignment in APL, but in practice people try to avoid it and write
one-liners instead, since that is the more powerful, and more natural,
way to think in APL; the pity is that it has such a limited set of
combinators.)

So what are the advantages that the lambda calculus gives us?  Higher
order functions, powerful typing systems, no side effects (except when
we want them), compact code, more abstraction.  The elimination of the
Von Neumann bottleneck, both in the programmer and in the machine; we
no longer have to think in terms of one thing at a time.
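
Higher order functions are functions that take or return functions.
Half of this can be faked even in C, with function pointers; a toy
example with invented names:

    #include <stdio.h>

    /* apply_twice is higher order: it takes a function as argument */
    static int apply_twice(int (*f)(int), int x) { return f(f(x)); }

    static int inc(int x) { return x + 1; }

    int main(void)
    {
        printf("%d\n", apply_twice(inc, 3));   /* prints "5" */
        return 0;
    }

What C cannot do easily is *build* a new function at run time, which
the lambda calculus does with every abstraction.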

What about paradigm three?  Well, here there is the notion of Hoare's
CSP (yes, and Milner's CCS, but let us not complicate things), which is
best seen in occam (no capitals), a load store language that
has been heavily modified to remove the Von Neumann bottleneck
mentioned above.  In fact it is barely still a load store language,
since so much of it is shaped by data flowing actively around the
machine, going and getting itself processed instead of waiting in
the queue.  A much less important implementation is in Ada, where the
tasking system is similar, but it is not obvious enough or well enough
integrated to affect the mindset of the programmer, being bolted onto
the side instead.
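
The flavour of the thing can be sketched with UNIX processes and a
pipe standing in for the channel (only an approximation: a real
CSP/occam channel is synchronous and unbuffered):

    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    /* two communicating processes: the data flows through a */
    /* channel instead of sitting in a shared store          */
    int main(void)
    {
        int chan[2], i, n;

        pipe(chan);
        if (fork() == 0) {                    /* producer process */
            for (i = 1; i <= 3; i++)
                write(chan[1], &i, sizeof i);
            _exit(0);
        }
        for (n = 0; n < 3; n++) {             /* consumer process */
            read(chan[0], &i, sizeof i);
            printf("received %d\n", i);
        }
        wait(NULL);
        return 0;
    }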

Paradigm four is maybe the least obvious to the typical programmer.
Prolog is the most obvious implementation of this sort of thing.  I am
not sure what we learn from it (I think it---i.e. Prolog, and more
generally, logic programming---is overrated as a programming
technique), but it does at least give some good ideas about what can be
done with pattern matching and non-determinism, and it does give a new
set of intellectual tools with which to approach any problem.
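
Non-determinism here just means choice points plus backtracking: try
one alternative, and on failure undo it and try the next.  A sketch in
C, using 4-queens as the example (Prolog gives you this control
structure for free):

    #include <stdio.h>
    #include <stdlib.h>

    #define N 4
    static int col[N];   /* col[r] = column of the queen in row r */

    /* can we put a queen at (row, c) given rows 0..row-1? */
    static int ok(int row, int c)
    {
        int r;
        for (r = 0; r < row; r++)
            if (col[r] == c || abs(col[r] - c) == row - r)
                return 0;
        return 1;
    }

    static int place(int row)
    {
        int c;
        if (row == N) return 1;        /* all rows filled: success */
        for (c = 0; c < N; c++)        /* the choice point         */
            if (ok(row, c)) {
                col[row] = c;
                if (place(row + 1)) return 1;
            }                          /* falling through = undo   */
        return 0;                      /* no choice worked: fail   */
    }

    int main(void)
    {
        if (place(0)) {
            int r;
            for (r = 0; r < N; r++)
                printf("%d ", col[r]); /* prints "1 3 0 2"         */
            printf("\n");
        }
        return 0;
    }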

Sorry if the above is a little scrappy, but it was typed in one go and
it is not going to be edited.

Anyway, I think I got my point across.  Of course, in practice, none of
these categorisations is complete, if only because most of them are
`infected' with the idea of state from the Von Neumann machine---the
only ones that are not impress the theoreticians but are not much use
in practice.

The important thing is to realise that the real differences lie not in
the sorts of data that programming languages deal with, but in the way
that they allow you to approach what you are doing.

Think in terms of natural languages: (unfortunately) most programmers
are in the position of linguists who have studied Italian, and Spanish,
and maybe a little German, and generalise cheerfully from that
knowledge while having no idea that there are languages such as
Chinese or Arabic.  Think of the interesting ideas they would get from
exposure to all the products of those cultures; how their ideas of
what a narrative is, or what a poem is, would be enlarged.

Sean

snorri@rhi.hi.is (Snorri Agnarsson) (01/03/90)

I would like to distinguish more precisely on memory management issues;
LISP, Icon, SNOBOL, etc. not only have dynamic allocation of memory
(which you also have in C and Pascal) but more importantly have
dynamic DEallocation of memory.  In my opinion this is a crucial aspect.
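
The contrast in C, where deallocation is the programmer's job (the
hazards in the comments are exactly what a collector removes):

    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *p = malloc(100);
        if (p == NULL) return 1;
        strcpy(p, "hello");
        free(p);     /* forget this and the storage leaks;       */
        p = NULL;    /* use p after free and the behaviour is    */
        return 0;    /* undefined.  LISP, Icon etc. reclaim      */
    }                /* unreachable storage automatically.       */
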
-- 
Snorri Agnarsson		|  Internet:	snorri@rhi.hi.is
Taeknigardur, Dunhaga 5		|  UUCP:	..!mcvax!hafro!rhi!snorri
IS-107 Reykjavik, ICELAND

roelof@idca.tds.PHILIPS.nl (R. Vuurboom) (01/03/90)

In article <1782@aipna.ed.ac.uk> sean@aipna.ed.ac.uk (Sean Matthews) writes:

[A number of interesting points a few of which I would like to look at.] 
|
|Take first some of the sorts of models that we have for computers:
|
|1. the Von Neumann stored program/ processor machine.
|2. the lambda calculus.
|2a. combinatory logic
|3. communicating processes of various sorts.
|4. proof search (e.g. sequent calculus)
|
1 is obviously a computer model, and 3 could be used as a computer model.
But 2, 2a and 4?  They look a lot more like language models to me.

|
|The Von Neumann machine, which is a sort of sophisticated Turing
|machine, comes to us in all sorts of forms, and has had a decisive
|influence (necessarily, probably) on the way computers have been built
|up to now, and (unfortunately, definitely) on the way computer
|scientists have been educated up to now.
|
IMO the von Neumann machine is a sort of sophisticated Turing machine
in the same way an integer variable is a sort of sophisticated bit
box.  You can emulate a computer with a Turing machine.  You can emulate
an integer with a bit box.  What you don't do is conceptualize an integer
as a bit box, and you don't (at least I don't) _conceptualize_ a computer
as a Turing machine - the difference in levels of sophistication and 
complexity is just too great (at least for me :-).  As I see it,
a Turing machine is no longer used by computer scientists as a computer
paradigm (except for some admittedly interesting theoretical results).
 
|It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
|C, PL/1, Simula, Pascal, Modula{2}, BCPL, Ada, Mesa, BASIC etc.
|

Agreed:
The von Neumann machine has produced assignment (imperative programming)
and this is a backbone of all the above languages.  A further separation
into stack-based and non-stack-based is possible, as you noted. 

|
|The second model that I have listed is the lambda calculus, which is a
|much nicer theoretical (and practical) model than load and store.  

As always, it depends on what your practice is as to whether or not
it's practical...

|In fact, ML is an interesting case, since it started out with pure
|typed lambda calculus and then added as much imperative structure as
|was needed (it turns out to be very little), and got a programming
|language that, for complex tasks, does not need to be much less
|efficient than C.
|
Maybe this is a little too simplistic: if you're using C to develop
a model which can be naturally implemented in ML, you're probably
right.  Vice versa, you could lose out.
 
|
|So what are the advantages that the lambda calculus gives us?  Higher
|order functions, powerful typing systems, no side effects (except when
|we want them), compact code, more abstraction.  The elimination of the
|Von Neumann bottleneck, both in the programmer and in the machine; we
|no longer have to think in terms of one thing at a time.
|
And let's not forget the disadvantages: it's next to impossible to
model any number of real world situations with it.

|What about paradigm three?  Well, here there is the notion of Hoare's
|CSP (yes, and Milner's CCS, but let us not complicate things), which is
|best seen in occam (no capitals), a load store language that
|has been heavily modified to remove the Von Neumann bottleneck
|mentioned above.  

Agreed.
|
|Paradigm four is maybe the least obvious to the typical programmer.
|Prolog is the most obvious implementation of this sort of thing.  I am
|not sure what we learn from it (I think it---i.e. Prolog, and more
|generally, logic programming---is overrated as a programming
|technique), but it does at least give some good ideas about what can be
|done with pattern matching and non-determinism, and it does give a new
|set of intellectual tools with which to approach any problem.
|
I would consider pattern matching and non-determinism to be more 
mechanisms than policies (viewpoints, paradigms).  The paradigm is not
pattern matching and non-determinism; the paradigm is descriptive
programming vs procedural programming.  I find prolog a little
clumsy in its form of expression (syntax), but where relevant it
offers, in my view, a higher (and better) form of abstraction than
does the lambda calculus (functional programming) or procedural
programming.
 
|
|The important thing is to realise that the real differences lie not in
|the sorts of data that programming languages deal with, but in the way
|that they allow you to approach what you are doing.
|
Couldn't agree more.

toma@tekgvs.LABS.TEK.COM (Tom Almy) (01/04/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:

>I think one of the weaknesses of this idea is that many languages have
>more than one fundamental concept, and thus arguments can begin over what
>the true fundamental concept is.  (In other words, the idea is ill-defined
>for some languages.)

Are you concerned with the "Fundamental Concept" at the time the language
was new, or in current use? It makes a difference.

>Language	Fundamental Concept

Assembler	labels + mnemonic instructions

>Lisp		Lists + dynamic scope
Scoping rules depend on dialect, but certainly true for the original Lisp.
>Fortran 	Arrays + fixed allocation
Originally "algebraic syntax" since only other alternative was assembler.
>FORTH		Threaded code + postfix notation
Originally true, but threaded code is not used in all implementations.
I feel the key concept is that all tokens execute the same way.

Smalltalk	All data are objects + browser
Pascal		Local functions
Modula-2	Modules + processes
BCPL		lvalue/rvalue concept + single data type (covers integers,
			arrays, functions, labels)
Algol		Lexical scoping + structured programming
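
The BCPL lvalue/rvalue concept survives almost unchanged in C, by the
way; a minimal illustration:

    #include <stdio.h>

    int main(void)
    {
        int x = 1;
        int *p = &x;   /* &x yields the address (lvalue) of x      */
        *p = 2;        /* *p as an lvalue: store through it        */
        printf("%d\n", *p);   /* *p as an rvalue: fetches 2        */
        return 0;
    }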


Tom Almy
toma@tekgvs.labs.tek.com
Standard Disclaimers Apply

debray@cs.arizona.edu (Saumya K. Debray) (01/04/90)

In article <1782@aipna.ed.ac.uk>, sean@aipna.ed.ac.uk (Sean Matthews) writes:
> Paradigm four is maybe the least obvious to the typical programmer.
> Prolog is the most obvious implementation of this sort of thing.  I am
> not sure what we learn from it ...

Computation as controlled deduction?  Program semantics as (simple)
statements of logic (as opposed to the rather baroque structures one
must construct for the denotational semantics of most languages)?

-- 
Saumya Debray		CS Department, University of Arizona, Tucson

     internet:   debray@cs.arizona.edu
     uucp:       uunet!arizona!debray

gudeman@cs.arizona.edu (David Gudeman) (01/04/90)

In article  <1782@aipna.ed.ac.uk> sean@aipna.ed.ac.uk (Sean Matthews) writes:
>The Von Neumann machine, which is a sort of sophisticated Turing
>machine,

Hardly.  They are related in the same sense that the lambda calculus
is related to combinatory logic, or in the same way that grammars are
related to predicate logic.

>...It has produced languages such as Fortran, Algol{58,60,68,W}, COBOL,
>C, PL/1, Simula, Pascal, Modula{2}, BCPL, Ada, Mesa, BASIC etc.
>
>And these languages have had such an influence on computer programming
>that most books on programming language design deal only with what to
>me seem the minor differences between them.

True enough...

>I think that this model has no theoretical virtue; all it offers is
>real world efficiency.

I disagree with this.  Algorithms were written long before there were
computers to execute them.  Part of the problem with this view is that
you are not distinguishing between the trivial models of algorithmic
computing (Turing machines and Von Neumann machines) and the
fundamental concept of an algorithm, which I define as: "a description
of a problem solution by systematic changes in a computing device,
based on rules of computation and on the current state of the device,
where the state of the device at termination encodes the problem
solution."  This view of computation is not only theoretically useful,
it is of great practical importance in finding solutions to many
problems that are conveniently solved by such a method.

In fact this method of computation is so convenient and intuitive,
that it is often used as a way of understanding programs written in
other models of computation.  For example, the lambda calculus is
often treated as a rewrite system, and I claim that rewrite systems
are essentially algorithmic in nature.  The same may be said of
grammars, and even predicate logic to a certain extent.
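
For instance, viewed as a rewrite system, a lambda calculus computation
is just repeated beta-reduction:

    (\f. \x. f (f x)) g a
        ->  (\x. g (g x)) a      [substitute g for f]
        ->  g (g a)              [substitute a for x]

Each step is a systematic, rule-governed change to the term, which is
algorithmic in precisely the sense defined above.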

>So what are the advantages that the lambda calculus gives us?  Higher
>order functions,

Algorithmic systems can have higher-order algorithms.

>powerful typing systems,

completely unrelated to the functional/algorithmic difference.

>no side effects (except when we want them)

No side effects _ever_, even when we want them.  Unless you are giving
up the purity of the paradigm.

>compact code

more related to the data abstractions available than to the computational model.

> more abstraction.

depends on your definition of abstraction.  If you define "more
abstract" as "less algorithmic", then this is true.

>The elimination of the Von Neumann bottleneck

This presumes that the Von Neumann machine is the basis of all
algorithmic languages.  In particular it presumes that algorithmic
methods are limited to computation on integers, and that structured
objects must be handled one element at a time.  This is far from true,
and many of the languages you listed as Von Neumann languages present
facilities for treating aggregates at any desired level.  What they
frequently don't have is _primitives_ for treating aggregates as a
whole, but these can be (and usually are) supplied in libraries.

A very large percentage of the criticisms of "conventional languages"
made by proponents of other systems are really criticisms of the data
types and primitive operations available in the languages.  For
example, one often sees FP proponents claim that the array data
structure is too limited because of its word-at-a-time operations.
But this criticism (which I agree with) doesn't imply that we should
switch to a functional language with sequences, it implies that we
should introduce sequences into the languages we prefer.
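
For example, nothing stops a C library from supplying whole-array
operations (vec_add is an invented name, not a standard routine):

    #include <stdio.h>
    #include <stddef.h>

    /* an aggregate operation as a library routine: callers treat */
    /* whole arrays as single values, even though the language    */
    /* itself has no array primitives beyond indexing             */
    static void vec_add(double *dst, const double *a,
                        const double *b, size_t n)
    {
        size_t i;
        for (i = 0; i < n; i++)
            dst[i] = a[i] + b[i];
    }

    int main(void)
    {
        double a[3] = { 1, 2, 3 }, b[3] = { 4, 5, 6 }, c[3];
        vec_add(c, a, b, 3);
        printf("%g %g %g\n", c[0], c[1], c[2]);   /* prints "5 7 9" */
        return 0;
    }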
-- 
					David Gudeman
Department of Computer Science
The University of Arizona        gudeman@cs.arizona.edu
Tucson, AZ 85721                 noao!arizona!gudeman

jschon@infmx.UUCP (John Schonholtz) (01/04/90)

In article <1470@mdbs.UUCP> wsmith@mdbs.UUCP (Bill Smith) writes:
>
>Language	Fundamental Concept
>
>Ada		Generics (?)

Not quite!  As anyone who has worked extensively with the language
knows...

Language	Fundamental Concept

Ada		Information Hiding

In fact, the DOD did such a good job here that, when examining
most Ada programs, it is impossible to find out anything.

;-)

-- 
*	John Schonholtz		* It's the end of the world as we know it
*   Informix Software, Inc.     * And I feel fine
*        415/926-6813		* pyramid!infmx!jschon      

ljdickey@water.waterloo.edu (L.J.Dickey) (01/09/90)

In article <1782@aipna.ed.ac.uk> sean@aipna.ed.ac.uk (Sean Matthews) writes:

>I have also listed a section 2a, combinatory logic, which is where
>we could put languages like FP, and even APL.  FP is clearly
>combinatory, while I would argue that the way that APL works is much
>more influenced by the combinatory style that it encourages than by
>the fact that it works with arrays.  (I know that there is
>assignment in APL, but in practice people try to avoid it and write
>one-liners instead, since that is the more powerful, and more natural,
>way to think in APL; the pity is that it has such a limited set of
>combinators.)

Readers are encouraged to track the progress of SAX (Sharp APL for UNIX),
which has introduced some new combinators.  Their vocabulary uses words
like "noun", "verb", "adverb", and "conjunction".  An adverb corresponds
to a mathematical operator because it acts on a function ("verb") and 
returns a function as a result.  Their new adverbs and conjunctions
are (for me) the most interesting combinators.  Some APL users are as
concerned about the proliferation of parentheses as, apparently,
some combinatory logicians were, and so find practical use for the
commute adverb, for instance.

A recent research paper on this topic, by some of those involved one way or another
with SAX, appears in APL Quote Quad, Volume 19, Number 4, August 1989,
on page 197.

-- 
    L. J. Dickey, Faculty of Mathematics, University of Waterloo.
	ljdickey@water.UWaterloo.ca	ljdickey@water.BITNET
	ljdickey@water.UUCP		..!uunet!watmath!water!ljdickey
	ljdickey@water.waterloo.edu	

alan@oz.nm.paradyne.com (Alan Lovejoy) (01/09/90)

What the world needs is a programming language whose "fundamental concept"
("raison d'etre") is abstraction. Period.

"Give me a place to stand on, and with this lever..."

Enough said.

____"Congress shall have the power to prohibit speech offensive to Congress"____
Alan Lovejoy; alan@pdn; 813-530-2211; AT&T Paradyne: 8550 Ulmerton, Largo, FL.
Disclaimer: I do not speak for AT&T Paradyne.  They do not speak for me. 
Mottos:  << Many are cold, but few are frozen. >>     << Frigido, ergo sum. >>

sakkinen@tukki.jyu.fi (Markku Sakkinen) (01/10/90)

In article <6917@pdn.paradyne.com> alan@oz.paradyne.com (Alan Lovejoy) writes:
-What the world needs is a programming language whose "fundamental concept"
-("raison d'etre") is abstraction. Period.
-
-"Give me a place to stand on, and with this lever..."
-
-Enough said.

CLU ?

Markku Sakkinen
Department of Computer Science
University of Jyvaskyla (a's with umlauts)
Seminaarinkatu 15
SF-40100 Jyvaskyla (umlauts again)
Finland

nick@lfcs.ed.ac.uk (Nick Rothwell) (01/10/90)

In article <2886@water.waterloo.edu>, ljdickey@water (L.J.Dickey) writes:
>Readers are encouraged to track the progress of SAX (Sharp APL for UNIX),
>which has introduced some new combinators.  Their vocabulary uses words
>like "noun", "verb", "adverb", and "conjunction".  An adverb corresponds
>to a mathematical operator because it acts on a function ("verb") and 
>returns a function as a result.

Why are "operators" different to functions?

What do you do about full higher-order facilities (e.g. higher-order
functions returning higher-order functions)? "adadadadadverbs"?

>    L. J. Dickey, Faculty of Mathematics, University of Waterloo.

		Nick.
--
Nick Rothwell,	Laboratory for Foundations of Computer Science, Edinburgh.
		nick@lfcs.ed.ac.uk    <Atlantic Ocean>!mcvax!ukc!lfcs!nick
~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~ ~~
  "...all these moments... will be lost in time... like tears in rain."

aarons@syma.sussex.ac.uk (Aaron Sloman) (01/15/90)

wsmith@mdbs.UUCP (Bill Smith) writes:

> Date: 29 Dec 89 19:40:36 GMT
> Organization: MDBS Inc., Lafayette, IN
>
> I have an idea regarding programming language design that I would
> like to toss out in order to see how accurately it describes the
> high level thinking that goes into a new programming language.
>
> The idea is "fundamental concept."  The fundamental concept of
> a language is the data structure, control structure or style issue
> that distinguishes the language and must be understood in detail before
> proficient use of the programming language can begin.
>
POP-11
        Block-structure with multi-bracket syntax, collection of
        abnormal exit procedures (chain, exitfrom, exitto), use of
        "open" stack (Like Forth), "dynamic local expressions",
        lexical and dynamic binding, most kinds of numbers, records,
        vectors, arrays, strings, "words" (in a dictionary),
        procedures as data objects, stream processing via repeaters
        and consumers, list processing including pattern matcher,
        dynamic lists (limited lazy evaluation), lightweight
        processes, properties, class-procedures, syntax extendable
        by code-planting as well as macro expansion, autoloadable
        libraries, OOP subsystem, incremental compiler, screen
        editor as a sub-routine.

        (AIM: an easy language for beginners that "grows" as you
        learn, and remains useful for advanced R&D).

Actually - that list applies only to Poplog Pop-11. Alphapop (Byte
May 1988) and earlier dialects (POP2, POP-10, WPOP) contain a
subset.

All dialects of POP have at least the open stack, dynamic binding
of variables, garbage collector, incremental compilation, records,
arrays, vectors, strings, words, list processing, partial
application.

Also, I have mixed up language features with other features. But
from the point of view of a user there's no sharp boundary.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QH, England
    EMAIL   aarons@cogs.sussex.ac.uk
            aarons%uk.ac.sussex.cogs@nsfnet-relay.ac.uk
            aarons%uk.ac.sussex.cogs%nsfnet-relay.ac.uk@relay.cs.net
    BITNET: aarons%uk.ac.sussex.cogs@uk.ac
    UUCP:     ...mcvax!ukc!cogs!aarons
            or aarons@cogs.uucp