[comp.software-eng] Soft-Eng Digest V3 #14

MDAY@XX.LCS.MIT.EDU.UUCP (11/02/87)

Soft-Eng Digest             Sun,  1 Nov 87       Volume 3 : Issue  14

Today's Topics:
                    Usenet Discussions (Overview)
                      Ideal Languages (15 msgs)
----------------------------------------------------------------------
Date: 01 Nov 87 18:12:32 EST
From: Mark S. Day <MDAY@XX.LCS.MIT.EDU>
Subject: Usenet Discussions (Overview)

I have attempted to summarize for the digest a number of messages that
fell out of a discussion of software technology's apparent failings
compared to hardware technology.  One branch of this discussion dealt
entirely with a series of complaints about current languages and their
failings.  These messages have the subject heading "Ideal Languages"
since they have developed quite separately from the other discussions
about software technology.

There are two phenomena of Usenet discussions that I have tried to edit
out.  One is the "me too" message where someone adds a small delta to
the discussion without really making any significant point.  The other
is the "interruption" style of arguing a case, where a previous message
is dissected sentence-by-sentence, with various comments as
interruptions of the original text.  

Contributors can improve their chances of appearing in the digest by
writing focused, concise articles.  Flames and personal attacks are
essentially always flushed. 

--Mark

------------------------------

Date: 25 Oct 87 12:12:32 GMT
From: k.cc.purdue.edu!l.cc.purdue.edu!cik@j.cc.purdue.edu  (Herman Rubin)
Subject: Ideal Languages

Unfortunately, software technology seems concerned only with attempts to do a
highly restricted set of operations.  This has also recently gotten into 
hardware development.  I think that the ideas of the software and hardware
gurus can be likened to producing automobiles which can be programmed to
get you to an address you type in, but will not let you back the car out of
the garage into the driveway.  I suggest that the software developers first
consider the power of existing hardware, next the natural power of hardware
(what unrealized operations can be easily done in hardware but are not now
there), and finally the "current use."  FORTRAN, when it was produced, was
not intended for system subroutine libraries.  There are instructions present
on most machines which someone knowing the machine instructions will want to
use as a major part of a program, but which the HLL developers seem to ignore.

I suggest that a language intended for library development be approached by
the developers with the attitude that a machine cycle wasted is a personal
affront.  I think we will find that the resulting languages will be easier
to use and less artificial than the current ones.  Implementing an arbitrary
set of types is no more difficult for the user than the 5 or 6 that the
guru thinks of.  Allowing the user to put in his operations and specify
their syntax is not much more difficult for the compiler than the present
situation.  For example, I consider it unreasonable to have any language
which does not allow fixed point arithmetic.  It may be said that this would
slow down the compiler.  However, every compiler I have had access to
is sufficiently slow and produces sufficiently bad code that it would be
hard to do worse.

I suggest that there be a major effort to produce an incomplete language
which is 
        1.  Not burdened by the obfuscated assembler terminology.

        2.  Easily extended by the user.

        3.  Designed with the idea that anything the user wants to do
should be efficiently representable in the extended language.

        4.  Restrictions on the use of machine resources should be as
non-existent as possible, and should be overridable by the user if at
all possible.  The restrictions on register usage in the C compilers
I have seen I consider a major felony.

        5.  Remember that the user knows what is wanted.  I hereby 
execrate any language designer who states "you don't want to do that"
as either a religious fanatic or sub-human :-).  Those who say that
something should not be allowed because "it might get you into trouble"
I consider even worse.

        6.  Realize that efficient code does not necessarily take longer
to produce than inefficient, and that there are procedures which are not
now being used because the programmer can see that the resources available 
to him will make the program sufficiently slow that there is no point in
doing the programming.

I think that if a reasonable language were produced, we would see that there
would be a new era in algorithm development, and that the hackers would be
competing in producing efficient software.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (ARPA or UUCP) or hrubin@purccvm.bitnet

------------------------------

Date: 27 Oct 87 01:28:17 GMT
From: crowl@cs.rochester.edu  (Lawrence Crowl)
Subject: Ideal Languages

No, software developers should first consider the task to be solved.  The 
efficient use of the hardware is moot if the software does not meet the need.

The task of language developers is not (or should not be) to directly support
the hardware, but to provide an environment in which a programmer can
effectively express the solution to a problem.  In those cases where efficiency
matters, the language model is generally chosen to be efficiently realized on
conventional machines.  Catering to specific instructions on specific machines
is generally a loss because the resulting programs are useful only on that
machine.  Supporting common instructions directly in the language often means
presenting an inconsistent model.  For instance, the arithmetic right shift
provided by many machines provides division by two except when the number is
negative and odd.  Should languages be designed around this quirk?  I do not
think so.

User-modifiable syntax is very difficult to define consistently and very
difficult to parse.  The consensus so far appears to be that it is not worth
the cost.

If the target audience of the language does not need fixed point arithmetic,
the introduction of a fixed point type is counterproductive.  How many
compilers have you looked at?  Some produce very good code.  Others are
atrocious.  It is far easier to criticize code generators than to provide a
generator that produces better code.

The user may know what is wanted, but translating that into code is not always
a simple task.  Consider assigning a boolean value to an integer.  Is this
something that "the user knows what he's doing" and the language should accept,
or is it something that "the user doesn't want to do" and may "get him in
trouble"?  Almost always it is not what the user wants.  If it is what the
user wants, the result is usually a non-portable, difficult-to-understand
program.  (The management usually does not want the latter even if the
programmer does.)

Efficient code almost always takes longer to produce than inefficient code.
You must invest development time to get efficiency.

Algorithm development is independent of any specific language, so a new
language will probably have little effect on algorithms.  Hackers are already
competing in producing efficient software, so a new language will have little
effect here also.

  Lawrence Crowl		716-275-9499	University of Rochester
		      crowl@cs.rochester.edu	Computer Science Department
...!{allegra,decvax,rutgers}!rochester!crowl	Rochester, New York,  14627

------------------------------

Date: 27 Oct 87 03:10:19 GMT
From: defun!shebs@cs.utah.edu  (Stanley T. Shebs)
Subject: Ideal Languages

Sounds like you want Forth.

There are reasons for those [register] restrictions, as is clear from studying one
or two compilers.  Some registers are needed for implementing protocols,
particularly in function calling.  Other registers are used for constant
values (0, 1, -1 are popular).  You should try writing a register allocator
before engaging in name-calling.

By your rationale, every language should include PEEK and POKE, and the
hardware shouldn't generate any of those silly "segmentation violation"
traps.  Lisp systems should allow the programmer to acquire the address
of an object, even if it will be worthless one millisecond later when the
garbage collector kicks in.  I hardly think a responsible language designer
would include a capability that has proven from experience to be a source
of catastrophic and mysterious bugs, especially if the capability itself
is not particularly important.
 
The whole proposal sounds like something out of the 50s or 60s, when debates
raged over the value of "automatic programming systems" (like Fortran!).

It is interesting to note that the proposal omits what is probably the #1
reason for HLLs: PORTABILITY.  It's all well and good to talk about exploiting
the machine, but my tricky code to exploit pipelined floating point operations
on the Vax will be utterly worthless on a Cray.  The prospect of rewriting
all my programs every couple years, and maintaining versions for each sort
of hardware, would be enough to make me go work in another field!

In fact, the view that software should be independent of hardware is one of
the great achievements of software technology.  The battle of abstraction vs
specialization has been going on for a long time, and abstraction has won.
The victory is rather recent; even ten years ago it was still generally 
assumed that operating systems and language implementations had to be written
in assembly language...

							stan shebs
							shebs@cs.utah.edu

------------------------------

Date: 27 Oct 87 15:38:38 GMT
From: k.cc.purdue.edu!l.cc.purdue.edu!cik@j.cc.purdue.edu  (Herman Rubin)
Subject: Ideal Languages

Are we to be limited to those features which are portable to all machines?  If
we do this, the only arithmetic operations we can allow are fairly short
integer arithmetic; of the machines I have looked into in the past few years, no
two have the same floating-point arithmetic.  This includes the CDC6x00, VAX,
CYBER205, CRAY, PYRAMID, IBM360 and its relatives.  And should we use the same
algorithm on different machines?  I, for one, would question the intelligence
of those who would attempt to enforce such restrictions.  The languages should
not be designed around the quirks; they should be powerful enough to enable
the programmer to make use of the quirks.

I am not advocating a language to optimize a specific machine.  I believe that
a language should be sufficiently powerful that the intelligent programmer can
optimize on whatever machine is being used at the time.  If the instruction is
not there, it cannot be used.  It may be worth replacing, or it may be
desirable to completely revise the computational procedure.  After a program is
written, especially a library program, it may be found that quirks in the 
machine cause the program to be too inefficient.  In that case, it is necessary
to think the whole program design over.  A good programmer will soon learn
what is good on a particular machine and what is not.  I can give competitive
algorithms for generating normal random variables on the CYBER205 which I know,
without programming them, will not be competitive on any of the other machines
I have listed above.  These algorithms cannot be programmed in any HLL and be
worthwhile.  (Any machine architecture can be simulated on any other machine
with any sufficiently complex language if there is enough memory, but the
resulting program is not worth attempting.)  There are algorithms which are
clearly computationally very cheap, using only a few simple bit operations,
for which it is obvious that no HLL can give a worthwhile implementation, and
for which it is questionable as to which machines have architectures which
make those algorithms not cost an arm and a leg.

If the syntax is somewhat limited, it will still be very powerful and not so
difficult to parse.  The reason that the typical assembler language is so
difficult to use is the parsing difficulty of 35 years ago.  Except for
not using types, Cray's assembler constructs on the CDC6x00 and on the CRAYs
go far in the right direction.

There is no adequate language for the production
of library subroutines.  If you say that the audience does not exist, then you
are certainly wrong.  If you say that the audience is small, then one could
equally criticize the existence of a Ph.D. program in any field.  I question
the need for a language which will keep the user ignorant of the powers of the
computer.  I also question whether such a language, unless very carefully 
presented as incomplete, with sufficiently many indications of its deficiencies,
will facilitate the eventual enlightenment of the learner; in fact, I believe
that this exemplifies one of the major reasons for the brain-damaged nature of
our youth.  How can a compiler produce good code if it cannot use the
instructions necessarily involved in that code?  If fixed point arithmetic is needed,
the tens of instructions needed to achieve that in a language such as C do not
constitute good code if the hardware is available.  Unfortunately, some 
machines such as the CRAY do not provide decent fixed point multiplication;
on such machines it is necessary to work around it, and one may find it
advisable to totally revise the algorithm.

The user should be able to introduce type definitions (structures in C), new
operators (such as the very important &~, which may or may not compile
correctly), and overload old operators.  This should be very flexible.

The VAX has twelve general-purpose registers available.  If I write a program
which uses eleven registers, I object to the compiler, which does not need any
registers for other purposes, only giving me six.

It is true that the language does not directly affect the algorithm.  However,
someone who considers whether or not there is any point in implementing the
resulting algorithm will necessarily consider the available tools.  If an
algorithm involves many square roots, I would be hesitant in using it rather
than a less efficient one which does not unless square root is a hardware
instruction, which it should be but is not on most machines.  The number of
reasonable algorithms is infinite, not merely very large.

Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (ARPA or UUCP) or hrubin@purccvm.bitnet

------------------------------

Date: 28 Oct 87 05:21:52 GMT
From: defun!shebs@cs.utah.edu  (Stanley T. Shebs)
Subject: Ideal Languages

There aren't enough software people in the world to solve each problem in an
individual and idiosyncratic fashion, so instead several tasks are decreed to
be "similar", which really means that some compromises have to be made.
This situation is not unique to software.  For example, bridge designers don't
usually get new and unique rivets for each bridge - instead, they have to order
from a catalog.

Everybody wants every one of their programs to be maximally efficient on all
imaginable hardware, while at the same time spitting on programmers because
it doesn't happen at the press of a couple keys.  It's unfortunate that the
popular culture encourages the belief that hackers can conjure up complicated
programs in a few seconds.  I suspect it even influences people who should
know better.  To quote Fred Brooks, "there is no silver bullet".

I find it interesting that the people who have been programming the longest -
numerical analysts - are exactly those who use dozens of different algorithms
to solve the same problem, each with slightly different characteristics.
Could mean either that multiplicity of algorithms is coming to the rest of
computing soon, or that numerical analysts haven't yet found the correct forms
for their algorithms...

I think you would have a hard time proving that the minor syntactic 
improvements of the Cray assembly language (with which I am familiar)
have any effect at all on productivity, reliability, etc.  After all, Lisp
programmers can be immensely productive even though the assignment statement
has a verbose prefix (!) syntax.

My thesis is most directly concerned with the implementation of Lisp runtime
systems, which have many similarities to Fortran libraries (really).
The basic angle is to supply formal definitions of the desired functionality
and the hardware, then to use a rule-based system to invent and analyze
designs.  Not very intelligent as of yet, but I have hopes...

Of course, this approach involves a couple assumptions:

1. As you say, there is no "adequate" language for the production of library
subroutines.  After poking through the literature, it is pretty clear to me
that no one has *ever* designed a high-level language that also allows direct
access to different kinds of hardware.  There are reasons to believe that such
a beast is logically impossible.  The output of my system is machine-specific
assembly language.

2. I also assume that human coding expertise can be embodied in machines.
What compilers can do today would have amazed assembly language programmers
of 30 years ago.  There is no reason to believe that progress in this area
will encounter some fundamental limitation.  I've seen some recent papers (on
the implementation of abstract data types using rewrite rules) that still seem
like magic to me, so certainly more advances are coming along.  Closer to
reality are some compiler projects that attempt to figure out optimal code
generation using a description of the target machine.  This is extremely
hard, since machine "quirks" are usually more like machine "brain damage".

Take a look at HAL/S, which is a "high-level assembly language" used in the
space shuttle computers.  Allows very tight control over how the machine
code will come out.  In fact, there are probably a few people reading this
group who could offer a few choice comments on it...

What's wrong with Forth?  It can be extended in any way you like, it can be
adapted to specific machine architectures, the syntax is better than raw
assembly language, and its compilers don't do awful things behind the user's
back.  You'll have to be more specific on how it fails.  I agree with you that
C and Prolog cannot always be adapted to machine peculiarities.

No silver bullet...

							stan shebs
							shebs@cs.utah.edu

------------------------------

Date: 28 Oct 87 18:00:21 GMT
From: pioneer!eugene@ames.arpa  (Eugene Miya N.)
Subject: Ideal Languages

In article <5084@utah-cs.UUCP> shebs%defun.UUCP@utah-cs.UUCP (Stanley T. Shebs) writes:
>Take a look at HAL/S, which is a "high-level assembly language" used in the
>space shuttle computers.  In fact, there are probably a few people reading
>this group who could offer a few choice comments on it...

You asked:
	Grrrrrr.

From the Rock of Ages Home for Retired Hackers:

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
  "You trust the `reply' command with all those different mailers out there?"
  "Send mail, avoid follow-ups.  If enough, I'll summarize."
  {hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene

------------------------------

Date: 28 Oct 87 19:58:02 GMT
From: crowl@cs.rochester.edu  (Lawrence Crowl)
Subject: Ideal Languages

You are still oriented on supporting the hardware instead of describing a
solution.  A real high level language is independent of the word size.  If a
machine does not have a long integer, the implementation of the language must
build it from short integers.

Given that highly accurate floating point routines are heavily dependent on
specific floating point formats, this is a problem.  No high level language
will be portable from machine to machine when the formats are different.  The
IEEE floating point standard should help.

Any language which allows the programmer to make use of the quirks must make
them visible.  This makes the language complex, difficult to understand,
difficult to reason about and makes the resulting programs non-portable.

If you are trying to get 100% performance, you will have to use assembler.
No language that has any meaning beyond a single machine can hope to meet your
needs.  It seems that your real complaint is that assembler languages are too
verbose, not that high level languages do not allow sufficient access to the
machine.  It should not be too hard to whip together an expression-based
"pretty" assembler.

Very few languages are incomplete in the sense of being unable to compute an
arbitrary function.  However, assembly languages are examples of incomplete
languages.  All languages are incomplete with respect to their support of
the programmer for some application area.  It is true that there are no
languages tuned to the machine specific implementation of highly optimized
numerical libraries.  I suspect it would look very much like the "pretty"
assembler.

You apparently program in a very narrow world where performance is everything.
The vast majority of programming is not done in such a world.

C++ allows programmers to introduce new types and overload existing operators.
It does not allow the definition of new operators, but does allow definition of
functions which achieve the same effect at the cost of a slightly more verbose
syntax.

Efficient code takes longer because the algorithms are generally more complex,
not because the languages are bad.


  Lawrence Crowl		716-275-9499	University of Rochester
		      crowl@cs.rochester.edu	Computer Science Department
...!{allegra,decvax,rutgers}!rochester!crowl	Rochester, New York,  14627

------------------------------

Date: 29 Oct 87 15:56:06 GMT
From: decvax!necntc!culdev1!drw@ucbvax.Berkeley.EDU  (Dale Worley)
Subject: Ideal Languages

crowl@cs.rochester.edu (Lawrence Crowl) writes:
| For instance, the arithmetic right shift
| provided by many machines provides division by two except when the number is
| negative and odd.  Should languages be designed around this quirk?  I do not
| think so.

This is not so simple...  What do you mean by "division by 2"?  There
are actually at least two ways of doing integer division:  one way is
to always round the mathematical quotient towards 0.  This is the
common, or "Fortran", way to do it, but to a certain extent it is
historical accident that most programming languages do it this way.
Of course, this is *not* what arithmetic right shift does.

But this method is not necessarily the most natural way to do
division.  In many cases, this is more natural:  round the
mathematical quotient *down*, that is, more negative.  This is
equivalent to the common way for positive quotients, but is different
for negative quotients:  (-1)/2 = -1.

(If you think this method is silly, consider the problem:  I give you
a time expressed in minutes before-or-after midnight, 1 Jan 1980.
Find the hour it occurs in, expressed in hours before-or-after
midnight, 1 Jan 1980.  The advantage of the second division here is
that the quotient is computed so the remainder is always positive.)

I have had to write functions in several programming languages to
perform this second form of division.

Dale
-- 
Dale Worley    Cullinet Software      ARPA: culdev1!drw@eddie.mit.edu
UUCP: ...!seismo!harvard!mit-eddie!culdev1!drw
If you get fed twice a day, how bad can life be?

------------------------------

Date: 29 Oct 87 11:21:01 GMT
From: k.cc.purdue.edu!l.cc.purdue.edu!cik@j.cc.purdue.edu  (Herman Rubin)
Subject: Ideal Languages

The job of the software developer is to produce the flexible
tools so that the thinking brain can do the job.  I think that this can be done
by producing typed, but not stongly typed, assemblers with a reasonable syntax
(for example, instead of using
	OPMNEMONIC_TYPE_3	z,y,x
on the VAX, if x, y, and z are of type TYPE I want to be able to write
	x = y OPSYM z
where OPSYM is the operation symbol (+,-,*,etc.)

The user should also be able to produce macros in the same format.  The recent
discussion of 16 bit * 16 bit = 32 bit is an example of this.

I suggest that this means that numerical analysts realize that the algorithm to
use depends on the circumstance.  A reasonable driver uses dozens of different
algorithms in a single trip.

Neither is if ... then...else.  Both should be avoided on those machines, and
replaced by different procedures.  Again, non-portability.  There is
considerable effort going into designing algorithms which can take advantage of
parallelism and vectorization.  For many purposes, I would not even consider
using the same algorithm on the VAX and the CYBER205.  There are good vector
algorithms on the 205 which would be ridiculous on the CRAY, which is also a
vector machine.

Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (ARPA or UUCP) or hrubin@purccvm.bitnet

------------------------------

Date: 30 Oct 87 05:49:23 GMT
From: munnari!moncskermit!moncsbruce!conybear@uunet.uu.net  (Roland Conybeare)
Subject: Ideal Languages

	I feel frustrated when I can imagine simple, efficient machine-level
code to solve a problem, but I cannot get my HLL of choice to produce it.

	For example,  suppose I want to allocate space on the stack,  but I don't
know how much space will be required until runtime (this is very important
if I want to implement function calls efficiently in an interpreter,  or
write a B-tree where the user provides the size of the elements to be stored
in each B-tree).  
	How do I proceed?  If the language does not provide this facility and
gives me no way around its restrictions, then I must find another
solution within the language, and this may be a high price to pay (in a lisp
interpreter, the price might be allocating N cons-cells from the heap on
every function call; in a B-tree, allocating 1 B-tree node from the heap for
each activation of the add() procedure).
	On the other hand,  if the language does allow me to turn off restrictions
(e.g. in-line assembler, type casting), then I can have a non-portable
but efficient solution. 

	I believe that it is *my* business, not the language designer's, to 
choose the tradeoff between efficiency and portability when the language
does not provide a solution to my problem.

	I feel that many modern computer languages suffer from *hubris* on the part
of the language designer(s).  Like everybody, language designers make mistakes,
and I have yet to see a 'perfect' language.  I do not expect the 
designer to anticipate all the features I might want in their language.  I
do mind when the designer says "take my language and submit to its rules,
or write in assembler".  I am most productive when I can submit to a
language's restrictions where those restrictions help me avoid errors;
and disable such restrictions, with care, when they obstruct "the best"
solution to a problem.

	It is generally agreed nowadays that languages should be extensible.  I
propose that we should strive to produce languages that are extensible in
as many different directions as possible, since every language feature which
is fixed places a restriction on the programmer and the power of the language.
	For example, LISP traditionally can be extended by adding functions;
however, just a few types (ints, strings, cons cells) are built into
the language and this set is not extensible.
	In Pascal or C, I can define types and functions, but not operators.
I cannot generally define constants in the types I define.

	I also propose that our compilers should be extensible.  My ideal compiler
would look like one of today's compilers, but decomposed into a set of modules,
with explicit interfaces.  Some modules would describe how to compile builtin
types.  If I did not like the compiler's code generation for floating point
numbers,  I could reimplement the appropriate module.  If I felt that complex
numbers were essential, I could implement them.  While much of this
can be done using user-defined types and functions, the important thing is
that if I have control over my types' compilation then I also control their
implementation.
	Furthermore,  I can develop an application quickly using whatever
compilation methods are available,  and optimise it later.  For example, I
might write a module to provide variable automatic allocation at run-time
using the heap, and later alter it to use the stack.

Roland Conybeare
(conybear@moncsbruce.oz)

------------------------------

Date: 30 Oct 87 23:09:46 GMT
From: defun!shebs@cs.utah.edu  (Stanley T. Shebs)
Subject: Ideal Languages

In article <326@moncsbruce.oz> conybear@moncsbruce.oz (Roland Conybeare) writes:

>	I feel that many modern computer languages suffer from *hubris* on the part
>of the language designer(s).  Like everybody, language designers make mistakes,
>and I have yet to see a 'perfect' language.  I do not expect the 
>designer to anticipate all the features I might want in their language.  I
>do mind when the designer says "take my language and submit to its rules,
>or write in assembler".

Of course, everyone is free to design their own languages.  It's also the
case that programmers are free to use whatever language is available, unless
they are working in some kind of preexisting and restrictive context (DoD,
maintenance of old code, etc).  Making the language implementation available
is just a matter of programming, using well-known techniques.  Therefore,
any moaning and complaining about languages must issue from people who are
unwilling to do the work themselves, and who think that there is a whole crowd
of language specialists who have nothing better to do, and must be coerced
into thinking the "right" way.

There's nothing to stop anyone from introducing a new and wildly popular
language, and I say to them: go for it!

>	I also propose that our compilers should be extensible.  My ideal compiler
>would look like one of today's compilers, but decomposed into a set of modules,
>with explicit interfaces.  Some modules would describe how to compile builtin
>types.  If I did not like the compiler's code generation for floating point
>numbers,  I could reimplement the appropriate module.  [...]

This is a very good idea, and can be found in the latest Lisp compilers.
Unfortunately, it's tricky to use, and not documented very well.  There is
still a lot of theory needed to get a certain level of generality;  without
that basis, users would moan and complain about all the restrictions placed
on extensions to the compiler.  For instance, getting register allocation to
work correctly in the presence of user-supplied assembly code is tricky...
Look for some amazing compilers in about 10-20 years, maybe less if the work
ever gets funded adequately.  (Compilers are not a fashionable topic at the
moment, sigh.)


							stan shebs
							shebs@cs.utah.edu

------------------------------

Date: 30 Oct 87 23:10:46 GMT
From: mcvax!enea!sommar@uunet.uu.net  (Erland Sommarskog)
Subject: Ideal Languages

conybear@moncsbruce.oz (Roland Conybeare) writes:
>   For example,  suppose I want to allocate space on the stack,  but I don't
>know how much space will be required until runtime (this is very important
>if I want to implement function calls efficiently in an interpreter,  or
>write a B-tree where the user provides the size of the elements to be stored
>in each B-tree).  
>   How do I proceed?  If the language does not provide this facility and
>gives me no way around its restrictions, then I must find another
>solution within the language, and this may be a high price to pay 

If this is important to you, use a language which helps you do this.
Assembler if you can live without range checking.  Perhaps you can do it
in C.  (I don't speak C, so I don't know.)
  If you prefer programming in a controlled way, languages like Simula
and Ada permit declaring arrays with the size set at run-time.  Whether
you will get an efficient program using these languages depends on the
*compiler*, not the language itself.

Seems like some people have all the time in the world.  But if you are that
lucky, why don't you write your own compiler?  The notion of a user-
modifiable compiler just gives me headaches having to read the manual
for it.  And I must say that I believe we programmers would be more
productive if we were not all wasting our time modifying the same compiler.


-- 
Erland Sommarskog       
ENEA Data, Stockholm    
sommar@enea.UUCP        
                    It could have been worse; it could have been Pepsi.

------------------------------

Date: 30 Oct 87 14:39:29 GMT
From: mcvax!enea!ttds!draken!sics!pd@uunet.uu.net  (Per Danielsson)
Subject: Ideal Languages

In article <5079@utah-cs.UUCP> shebs%defun.UUCP@utah-cs.UUCP (Stanley T. Shebs) writes:
>By your rationale, every language should include PEEK and POKE, and the
>hardware shouldn't generate any of those silly "segmentation violation"
>traps.  Lisp systems should allow the programmer to acquire the address
>of an object, even if it will be worthless one millisecond later when the
>garbage collector kicks in.  I hardly think a responsible language designer
>would include a capability that has proven from experience to be a source
>of catastrophic and mysterious bugs, especially if the capability itself
>is not particularly important.

Lisp machines allow precisely that.  It is a quite necessary part of
the system.  Of course the user will have to know exactly what he is
doing when using the facility, and most users seldom have any need for
it, but it has to be there.
-- 
Per Danielsson          UUCP: {mcvax,decvax,seismo}!enea!sics!pd
Swedish Institute of Computer Science
PO Box 1263, S-163 13 SPANGA, SWEDEN
"No wife, no horse, no moustache."

------------------------------

Date: 29 Oct 87 13:04:12 GMT
From: cbosgd!clyde!watmath!utgpu!utzoo!yetti!geac!daveb@ucbvax.Berkeley.EDU
Subject: Ideal Languages

  Well, C.R. Spooner wrote an article in '86 entitled "The ML
Approach to the Readable All-Purpose Language", in the ACM
Transactions on Programming Languages and Systems for April of 1986.

  This might easily fill the bill, although their original
(slightly kludgy) implementation probably wouldn't.  Details
available on request.

 --dave
-- 
 David Collier-Brown.                 {mnetor|yetti|utgpu}!geac!daveb
 Geac Computers International Inc.,   |  Computer Science loses its
 350 Steelcase Road,Markham, Ontario, |  memory (if not its mind)
 CANADA, L3R 1B3 (416) 475-0525 x3279 |  every 6 months.

End of Soft-Eng Digest
******************************
