[comp.software-eng] Algebra vs. Adjectives and Adverbs

xtbjh@levels.sait.edu.au (behoffski) (05/01/91)

I would like to raise a topic for discussion based on an idea that I had 
quite some time ago.  The idea is very simple, but it allows some quite 
interesting improvements to language design and implementation.

The idea is this: programming languages that were originally derived 
from algebra (Pascal, Ada, C, FORTRAN, etc.) suffer from algebra's bias 
towards nouns and verbs over adjectives and adverbs.  This deficiency 
is especially noticeable when trying to build interfaces between 
system components.

For example, consider an interface to a software machine (or module, 
if you prefer) that implements binary trees.  The definition of the 
tree includes the "root" node and "leaf" nodes.  Now consider a 
request to find all the leaf nodes of the tree.  If you only have 
nouns and verbs, then the program will be something along the lines of:

        list.clear(list)
        tree.BeginEnumeration(this_tree, context)
        while tree.GetNextNode(context, this_node) do
                if tree.IsLeaf(this_node) then
                        list.add(list, this_node)
                end if
        end while

The simple adjective, leaf, has been turned into the verb IsLeaf.  
This has the unpleasant property of turning a single request, which 
is highly parallel, into a large number of sequential requests.  
The two biggest disadvantages of this result are:
        - in order to exploit parallel processing, a compiler must 
            reverse-engineer the original request, and
        - the interface for the machine "tree" must be highly 
            efficient since there will typically be many operations.
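
There is no Languish syntax to show yet, but here is a rough sketch in 
plain C of what a single qualified request could look like.  Every name 
below is invented purely for illustration:

        #include <stddef.h>

        /* Hypothetical interface -- none of these names come from the
         * original posting; node_is_leaf plays the adjective "leaf".  */
        struct node {
                struct node *left, *right;
        };

        typedef int (*qualifier)(const struct node *);  /* an "adjective" */

        int node_is_leaf(const struct node *n)
        {
                return n->left == NULL && n->right == NULL;
        }

        /* One request: "every node of the tree that <qualifies>".  The
         * caller never sees the traversal, so the machine may walk the
         * tree however it likes, in parallel if it can.  The out array
         * is assumed big enough to hold every node in the tree.        */
        void tree_select(const struct node *root, qualifier qualifies,
                         const struct node **out, size_t *count)
        {
                if (root == NULL)
                        return;
                if (qualifies(root))
                        out[(*count)++] = root;
                tree_select(root->left,  qualifies, out, count);
                tree_select(root->right, qualifies, out, count);
        }

The whole loop above collapses into one call such as 
tree_select(this_tree, node_is_leaf, list, &n), so the interface to the 
tree machine is crossed once instead of once per node.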

Another nice property of adjectives is that they allow interfaces that 
scale gracefully between very small machines and very large ones.  For 
example, in a fairly well-balanced binary tree roughly half of the nodes 
are leaves (a full binary tree with L leaves has L - 1 internal nodes), 
so a single request qualified by "leaf" describes an amount of work 
proportional to the size of the tree, whatever that size happens to be.

It is interesting that adjectives and adverbs do already appear in 
software, mainly in two situations:
        - as command line switches, where non-positional switches 
            correspond to adverbs and positional switches correspond 
            to adjectives (a small sketch follows this list), and
        - in various contexts within graphical interfaces (e.g. clicking 
            a node in a tree selects all nodes beneath that node).
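
To make the command-line case concrete, here is a toy example.  The 
command and its -v/-r switches are invented; only getopt() itself is a 
standard library routine:

        #include <stdio.h>
        #include <unistd.h>

        /* A toy "list" command: the switches behave like adverbs
         * ("verbosely", "recursively"); the positional arguments
         * that remain are the nouns.                              */
        int main(int argc, char **argv)
        {
                int verbose = 0, recursive = 0, opt, i;

                while ((opt = getopt(argc, argv, "vr")) != -1) {
                        switch (opt) {
                        case 'v': verbose = 1;   break;  /* adverb */
                        case 'r': recursive = 1; break;  /* adverb */
                        default:  return 1;
                        }
                }

                for (i = optind; i < argc; i++)  /* nouns: the operands */
                        printf("listing %s%s%s\n", argv[i],
                               recursive ? " recursively" : "",
                               verbose   ? " verbosely"   : "");
                return 0;
        }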

Another advantage of adjectives and adverbs is that they allow an 
interface to be more natural and self-documenting.  The resulting code 
can look much like executable pseudocode.  The interface will usually 
be more general since each language component will need to be implemented 
independently of any other component.  

The reusability of machines defined with adjectives and adverbs is also 
sharply higher.  This is because modules remain separate, instead of being 
continually mingled in loops like the example above.  Another benefit is 
that edge cases, which often feature in off-by-one bugs, are greatly 
reduced since most of them are handled within the adjectives or adverbs.

One of the most promising areas opened up by this idea is allowing 
performance improvements in machine implementation without touching 
the original specification.  For example, if a particular combination 
of adjectives and/or nouns and/or adverbs and/or verbs was very common 
in a program (e.g. one-legged Australian programmers immediately shoot), 
then the machine implementation could add an optimised program for that 
special case.  This could be true even where the modifiers applied to 
a noun or verb originally came from different machines.  
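
One possible shape for this, again sketched in C with invented names 
(nothing here is settled Languish machinery): the machine keeps its 
general qualifier-driven path, but recognises the one combination that 
turns out to be common and dispatches it to a hand-tuned routine:

        #include <stdio.h>

        typedef unsigned qual_set;     /* bitmask of adjectives/adverbs */
        #define Q_LEAF      0x01u
        #define Q_LEFTMOST  0x02u
        #define Q_IMMEDIATE 0x04u      /* an "adverb" on the request */

        /* General path: walk everything, testing each qualifier. */
        void general_select(void *machine, qual_set quals)
        {
                printf("general selection, qualifiers 0x%x\n", quals);
        }

        /* Hand-tuned routine for the combination that turned out hot. */
        void leaf_select_fast(void *machine)
        {
                printf("optimised leaf selection\n");
        }

        /* The dispatch lives entirely inside the machine; the original
         * specification (and the calling program) never changes.      */
        void machine_request(void *machine, qual_set quals)
        {
                if (quals == Q_LEAF)           /* the common special case */
                        leaf_select_fast(machine);
                else
                        general_select(machine, quals);
        }

The calling program still issues exactly the same request; only the 
dispatch inside the machine changes.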

I apologise for the many hazy edges in the discussion above.  Despite 
trying to define a language based on this idea for over a year, there are 
still many grey areas in syntax and implementation that I haven't been able 
to tie down without a lot of compromises.  I have, of course, already 
decided on the name for when it finally appears: Languish (puns intended).

--
Brenton Hoff (behoffski) | Senior Software Engineer | My opinions are mine
xtbjh@levels.sait.edu.au | AWA Transponder          | (and they're weird).

Nick_Janow@mindlink.bc.ca (Nick Janow) (05/07/91)

I suggest that you read "Thinking Forth" by Leo Brodie.  Forth's postfix
notation makes adverbs and adjectives (natural-language modifying words) easy
to implement.  Forth also allows applications to use near-natural language.
For instance, LIGHT BLUE BOX might add a light blue box to your screen.  LEFT
QUICKLY MOVE  DOWN SLOWLY MOVE might control a robot arm.
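
The mechanism is easy to mimic outside Forth, too.  As a rough C sketch 
(the words below are invented, and this is not how a Forth system works 
internally), each modifier word simply records state and the verb that 
follows consumes it:

        #include <stdio.h>

        /* Not Forth -- just the same trick in C: each modifier word
         * records state, and the verb that follows reads and clears it. */
        static const char *shade  = "";
        static const char *colour = "black";

        void LIGHT(void) { shade  = "light "; }
        void BLUE(void)  { colour = "blue";   }

        void BOX(void)   /* the verb: uses whatever the modifiers set */
        {
                printf("drawing a %s%s box\n", shade, colour);
                shade = "";  colour = "black";   /* reset for next phrase */
        }

        int main(void)
        {
                LIGHT(); BLUE(); BOX();  /* LIGHT BLUE BOX -> light blue box */
                BOX();                   /* plain BOX      -> black box      */
                return 0;
        }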

Take a look at Forth; it's a good language for writing other languages.

windley@ted.cs.uidaho.edu (Phillip J. Windley) (05/08/91)

In article <16281.281f708f@levels.sait.edu.au> xtbjh@levels.sait.edu.au (behoffski) writes:

   I would like to raise a topic for discussion based on an idea that I had 
   quite some time ago.  The idea is very simple, but it allows some quite 
   interesting improvements to language design and implementation.  

   The idea is this: programming languages that were originally derived 
   from algebra (Pascal, Ada, C, FORTRAN, etc.) suffer from algebra's bias 
   towards nouns and verbs over adjectives and adverbs.  This deficiency 
   is especially noticeable when trying to build interfaces between 
   system components.

   [... much deleted ...]

Your thesis is intriguing; however, could you give a code segment using
adjectives and adverbs to help those of us with small imaginations?  You
gave a code segment showing what's wrong with existing languages.  Can you
post the same code using your scheme?
--
Phil Windley                          |  windley@cs.uidaho.edu
Assistant Professor		      |  windley@cheetah.cs.uidaho.edu
Department of Computer Science        |
University of Idaho                   |  Phone: 208.885.6501  
Moscow, ID 83843                      |  Fax:   208.885.6645

wdr@wang.com (William Ricker) (05/09/91)

windley@ted.cs.uidaho.edu (Phillip J. Windley) writes:
>In article <16281.281f708f@levels.sait.edu.au> xtbjh@levels.sait.edu.au (behoffski) writes:

>>...The idea is this: programming languages that were originally derived 
>>   from algebra (Pascal, Ada, C, FORTRAN, etc.) suffer from algebra's bias 
>>   towards nouns and verbs over adjectives and adverbs.  This deficiency 
>>   is especially noticeable when trying to build interfaces between 
>>   system components.   ...

>Your thesis is intriguing; however, could you give a code segment using
>adjectives and adverbs to help those of us with small imaginations?  ...

Take a look in comp.lang.apl.  The new language J, by Iverson Software Inc
(yes, that Ken Iverson), has a regularized grammar modeled on Verbs, Nouns,
Adverbs/Adjectives, and Conjunctions -- rather than Functors, Operators, and
Things.  Reference: ACM SIGAPL Quote Quad, a/k/a Proceedings of APL'90,
Iverson & Hui, "APL\?".
    The recent discussion in c.l.a has focussed on some strange things
some users wanted to do (sort each column of an array separately!).  These
were simple to express in J (although no one understood why he'd want to,
thus showing it was not an intended ability), while the latest & greatest
traditional APLs (IBM APL\2 & the various time-shared APLs) had great
difficulty.
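
For comparison, here is roughly what "sort each column separately" costs 
in a scalar language such as C (purely an illustration; none of this is J):

        #include <stdlib.h>

        static int cmp_double(const void *a, const void *b)
        {
                double x = *(const double *)a, y = *(const double *)b;
                return (x > y) - (x < y);
        }

        /* Sort each column of a rows-by-cols matrix independently:
         * copy the column out, qsort it, copy it back.  Every step
         * of the iteration is explicit on the caller's side.       */
        void sort_columns(double *m, size_t rows, size_t cols)
        {
                double *col = malloc(rows * sizeof *col);
                size_t i, j;

                if (col == NULL)
                        return;
                for (j = 0; j < cols; j++) {
                        for (i = 0; i < rows; i++)
                                col[i] = m[i * cols + j];
                        qsort(col, rows, sizeof *col, cmp_double);
                        for (i = 0; i < rows; i++)
                                m[i * cols + j] = col[i];
                }
                free(col);
        }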

Disclaimer: I downloaded J from Waterloo, but haven't sent in my shareware
fee yet.  If your budget is better than mine, please be a better citizen!
-- 
/s/ Bill Ricker                wdr@wang.wang.com 
"The Freedom of the Press belongs to those who own one."
*** Warning: This account is not authorized to express opinions. ***