[comp.arch] Universal OS

jesup@pawl18.pawl.rpi.edu (Randell E. Jesup) (04/23/88)

In article <29301@amdahl.uts.amdahl.com> chuck@amdahl.uts.amdahl.com (Charles Simmons) writes:
>In article <21883@bu-cs.BU.EDU> bzs@bu-cs.BU.EDU (Barry Shein) writes:
>>Computers exist within a techno-economic framework. Why doesn't
>>someone build the ultimate frob panel that will work as well for an
>>oscilloscope as a microwave oven or synchrotron? Is this a reasonable
>>question?

>First, I would question whether the original poster really
>cares about having a single machine implementation and a single
>instruction set.  My claim is that most people are not directly
>exposed to these aspects of a computer system and that they
>could really care less.

	Usually this is quite correct: most users only care whether it runs
the software they want to use, and how fast it does so.

>I see no inherent reason why standard
>versions of an operating system and language cannot and should
>not exist, except for the fact that no one has yet designed
>the perfect OS and language.

	The perfect OS is like the perfect car, or the perfect man/woman, or
....  What is "perfect" is HIGHLY dependent on one's point of view.  Hardware
plays a BIG role in OS design, as do the types of applications that will be
run under it, the resource constraints, etc.  Even what should be considered
part of the "OS" is a highly debated point (see the BSD vs SysV wars).

	This isn't to say that there is no consensus (usually) on the major
features of a good OS, but even these depend on the use it will be put to.
Note the differences between the OSes of embedded controllers, multi-user
computers, individual workstations, Lisp Machines used for AI, and
airline mainframes.  There are reasons for the differences.

	I won't even think of touching the perfect language question.  I
don't want a mailbox with 50Meg of mail in it.  :-)

     //	Randell Jesup			      Lunge Software Development
    //	Dedicated Amiga Programmer            13 Frear Ave, Troy, NY 12180
 \\//	beowulf!lunge!jesup@steinmetz.UUCP    (518) 272-2942
  \/    (uunet!steinmetz!beowulf!lunge!jesup) BIX: rjesup
(-: The Few, The Proud, The Architects of the RPM40 40MIPS CMOS Micro :-)

gillies@uiucdcsp.cs.uiuc.edu (04/24/88)

You're right about the "perfect OS".  As we see computers controlling
and serving in toasters, cars, banks, schools, airplanes, satellites,
telephone systems, etc., is there any reason to believe they can all be
the same?

Then why don't we just replace toasters, cars, banks, schools, airplanes,
satellites, and telephone systems by "widgets" that perform all these
tasks together?

I hope that's a convincing argument for a layman in computer science.

Don Gillies {ihnp4!uiucdcs!gillies} U of Illinois
            {gillies@p.cs.uiuc.edu}

edw@IUS1.CS.CMU.EDU (Eddie Wyatt) (04/25/88)

 >>First, I would question whether the original poster really
 >>cares about having a single machine implementation and a single
 >>instruction set.  My claim is that most people are not directly
			 	      1. ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 >>exposed to these aspects of a computer system and that they
   ^^^^^^
 >>could really care less.
 > 
 > 	Usually this is quite correct, most users only care if it runs the
 > software they want to use and how fast it does this.

  Not really true.  Whenever you have a heterogeneous environment, the
user must be concerned.  There is a whole hierarchy of interfacing.

	1. data representation - important when data is shared between
		hosts (is there work needed in transforming the data?)
	2. object code - sun-4 objects are not compatible with sun-3 and
		hence two versions of the objects must be kept.
	3. OS - what is the interface to the signal handler under
		SYS V and BSD?

-- 

Eddie Wyatt 				e-mail: edw@ius1.cs.cmu.edu

prh@actnyc.UUCP (Paul R. Haas) (04/26/88)

The naive user doesn't want a universal OS, she wants to be able to
use the same interface everytime she tackles the same problem.  This
means she is happy if the interface on her automatic teller machine
is similar enough to the machine that sells her airline tickets for
her to use both without undue confusion or excessive relearning time.
She also wants to be able to drive any rental car without having to learn
how to drive again.  She does not want or expect the rental car to
have the same interface as an automatic teller.

The same should be true of computers.  A user should be able to use
the same programming language(s), the same command language(s)
(shell(s)) and the same window system(s) on all the computers that
she uses.  The choice of language should be based on the problem to
be solved and the person implementing the solution.

In the article that started this thread, the user said he wanted VMS
on every machine that he used.  I suspect he would be happy if DCL
(VMS's command shell) were available on those machines.  This is not
likely to happen because of marketing considerations (among others). 

If we can get new users to use X windows (or other widely available
interface) and to use only the widely available features of a widely
available language (such as C or Fortran avoiding nasty extensions)
then they will be able to switch between different operating systems
without a lot of grief.

This will not limit progress.  We can still learn new things, however,
we will have to continue to support the old.  I suspect that any
successful new operating system will be able to run programs developed
for Unix without having to change the source code.  Likewise new
window systems will be able to support existing X window applications.
The new operating systems and new window systems may make some things
very easy which used to be nearly impossible under the old systems.
---------------------------------------------------------------------
Paul Haas	uunet.uu.net!actnyc!prh

eugene@pioneer.arpa (Eugene N. Miya) (04/26/88)

In article <1520@pt.cs.cmu.edu> edw@IUS1.CS.CMU.EDU (Eddie Wyatt) writes:
> >>First, I would question whether the original poster really cares. . .
> >>My claim is that most people are not directly
> >>exposed to these aspects of a computer system and that they
> >>could really care less.
> > 	Usually this is quite correct, most users only care if it runs the
> > software they want to use and how fast it does this.
>  Not really true.  When every you have heterogeneous enviroment, the
>user must be concerned.  The is a whole hierarchy of interfacing.

You make an excellent point here about heterogeneous systems, and this
is what the issue of `transparent systems' should be about.  Concern is
one thing, but getting work done is another.  I give you an example from
personal experience using your own model:

>	1. data representation - important when data is shared between
>		hosts (is there work needed in transforming the data?)

In 1973, I was learning to get around on the ARPAnet.  I learned how to use
this command called "FTP" on our IBM 360/75 to get source files from
a distant machine, something called a "DEC-10"  ;-).  The Fortran had
'#' for '='.  This was an annoyance.  Text editors were few at the time,
and the changes minor, but the point is I had to make the changes.  I should
not have had to make even changes this minor: how, for instance, would I know
that I got them all?  Especially if the program is big.

>	2. object code - sun-4 objects are not compatible with sun-3 and
>		hence two versions of the objects must be kept.
>	3. OS - what is the interface to the signal handler under
>		SYS V and BSD!

Good points.  More fundamental questions could be made about I/O.
I suggest Habermann's CACM paper on FAMOS (Families of Operating Systems)
and Amdahl's paper on the 360/370 architecture as compatibility lessons.
Note that we are diverging from architectures; I have suggested follow-ups
to comp.os.research.

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
				soon to be aurora.arc.nasa.gov
at the Rock of Ages Home for Retired Hackers:
  "Mailers?! HA!"
  {uunet,hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene
  "Send mail, avoid follow-ups.  If enough, I'll summarize."

guy@gorodish.Sun.COM (Guy Harris) (04/26/88)

> 	3. OS - what is the interface to the signal handler under
> 		SYS V and BSD!

This particular point really continues a discussion started in comp.lang.c.
The discussion was inspired by some misconceptions about UNIX signal handlers.

Any rational version of UNIX will permit signal handlers to take a single
argument that is the signal number.  Any version that obliges signal handlers
to be written to take more than one argument will cause *so* much
already-written code to break that it can fairly well be considered broken
itself.

Fortunately, 4BSD has, as far as I know, been implemented only on systems
where the extra "signal code" argument that it passes doesn't make a
difference.  As such, it is not (yet) broken in that sense.

ANSI C specifies that signal handlers take one and only one argument.  This
also happens to break existing code; however, assuming that there are machines
that cannot support "varargs" without "varargs" functions being explicitly
declared as such, they had no choice.

In other words, if you want to write portable code, write it so that signal
handlers take one argument.  This will work on any non-broken UNIX system.
Thus, the issue of the interface to the signal handler is not an issue raised
by multiple versions of UNIX.  It *is* an issue if you have non-UNIX systems,
but if you have to deal with non-UNIX systems, it's probably one of the least
of the issues you will have to deal with.

As for the relevance of this point to "most people" and "most users", the
people mentioned in the original article:  to borrow a phrase I have heard
attributed to Frank Zappa, "most people" or "most users" wouldn't know a signal
handler if it came up and bit them in the ass.

cik@l.cc.purdue.edu (Herman Rubin) (04/26/88)

In article <843@actnyc.UUCP>, prh@actnyc.UUCP (Paul R. Haas) writes:

> The same should be true of computers.  A user should be able to use
> the same programming language(s), the same command language(s)
> (shell(s)) and the same window system(s) on all the computers that
> she uses.  The choice of language should be based on the problem to
> be solved and the person implimenting the solution.

Even when compilers are more intelligent than 99% of all programmers, this
is not true.  I believe that a fair language can be developed which is close
to universal, but it will have to be almost totally flexible.  However, even
the algorithm to solve a particular problem may have to be highly machine
dependent for any remotely reasonable efficiency.  The existence of a single
hardware instruction may affect the choice of algorithms greatly.  Thus, the
user needs to know about how machine capabilities affect performance.

We should now be striving for that flexibility.  It will help, but will not
solve the problem.  The language, shell, window, editor, etc. developers
should try to find out everything that a programming genius would consider
including (and do not rely on what one genius wants; ask everybody) and try
to include it _all_.  In addition, the (whatever) should be designed so that
any user can expand it easily, because I cannot tell you today about the
feature which I will consider "obviously" needed tomorrow.

-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (ARPA or UUCP) or hrubin@purccvm.bitnet

radford@calgary.UUCP (Radford Neal) (04/28/88)

In article <762@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:

> ... The language, shell, window, editor, etc. developers
> should try to find out everything that a programming genius would consider
> including (and do not rely on what one genius wants; ask everybody) and try
> to include it _all_.  In addition, the (whatever) should be designed so that
> any used can expand it easily, because I can not tell you today about the
> feature which I will consider "obviously" needed tomorrow.
> 
> -- 
> Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907


GAK! Have you ever _tried_ designing, and implementing, a language, 
editor, etc.? The last thing you want to do is include everything
anybody has ever thought might possibly be a good idea. Even the 
designers of Ada occasionally realized this, and left out, for example,
SNOBOL-style pattern matching (how could they? it's so powerful...)
and the ability to write programs in the style of Backus's FP (What? but
that's the wave of the future...).

As for "extensibility", it is much over-rated. Somehow, it always seems
that the really useful modifications are beyond the capabilities of the
extension mechanism. Quick, how can I write a Mock Lisp extension to
emacs to let it handle proportionally-spaced fonts? I don't think it's
even possible to fix the brain-damaged way it does auto-indent... To
exaggerate slightly, the main purpose of extensibility in systems is to
give the designer an excuse (well, if you don't like that, why don't you
change it yourself? ... What? I thought everyone knew Lisp...).

A universal operating system will be designed when someone very clever,
imaginative, and artistic creates a reasonably simple model of computer 
use that incorporates the needs of all users and is economically
implementable in the technology of the day. If this should ever occur,
the universal model will be much more likely to incorporate _none_ of
the ideas of past "programming geniuses" than to incorporate them all.

    Radford Neal

piet@ruuinf.UUCP (Piet van Oostrum) (04/29/88)

Posting-Front-End: GNU Emacs 18.47.9 of Mon Mar 21 1988 on ruuinf (hcx/ux)


In article <1556@vaxb.calgary.UUCP> radford@calgary.UUCP (Radford Neal) writes:

   A universal operating system will be designed when someone very clever,
   imaginative, and artistic creates a reasonably simple model of computer 
   use that encorportates the needs of all users and is economically
   implementable in the technology of the day. If this should ever occur,
   the universal model will be much more likely to encorporate _none_ of
   the ideas of past "programming geniuses" than to encorporate them all.

How about a Turing Machine?

-- 
Piet van Oostrum, Dept of Computer Science, University of Utrecht
Budapestlaan 6, P.O. Box 80.012, 3508 TA Utrecht, The Netherlands
Padualaan 14, P.O. Box 80.089, 3508 TB Utrecht (after May 11)
Telephone: +31-30-531806              UUCP: ...!mcvax!ruuinf!piet

franka@mmintl.UUCP (Frank Adams) (04/29/88)

In article <762@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>The language ... should ... try to include it _all_.

The problem with this is that the more things you throw into a language, the
harder it is to write a compiler for it.  And harder, when it comes to
software, means it takes longer.

Suppose we define this super-duper, all inclusive language.  Now a new
machine comes along.  Two years after its introduction, we want to run our
program on it.

In case 1, the program was written in some relatively simple language,
perhaps C.  The compiler writers got a compiler working in 6 months; they
have now had 18 months to optimize the output it generates.

In case 2, the program was written in our all-inclusive language.  In this
case, two years is barely enough time to write the compiler.  No
optimization has been done at all.  Furthermore, there are probably bugs in
the less used features of the language; if our program takes such features
into account, it will likely not work.

The result: although we have taken advantage of language features which
theoretically give us, say, a 10% speed improvement, the lack of
optimization makes the program run 30% slower.

The situation doesn't get much better.  However much optimization has been
done on our huge-language compiler, that much more has been done on the C
compiler.  Maybe ten years after the introduction of the machine, the curves
will cross (*if* development continues on the compiler for a language which,
all things considered, is not being used much), but by then the hardware is
obsolete.  (It may still be in use, but if performance is our main concern,
we will have gone to something else.)

The desire to throw everything and the kitchen sink into a language is a
very natural temptation, but it is a mistake.
-- 

Frank Adams                           ihnp4!philabs!pwa-b!mmintl!franka
Ashton-Tate          52 Oakland Ave North         E. Hartford, CT 06108

cik@l.cc.purdue.edu (Herman Rubin) (04/30/88)

In article <1556@vaxb.calgary.UUCP>, radford@calgary.UUCP (Radford Neal) writes:
> In article <762@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
| 
| > ... The language, shell, window, editor, etc. developers
| > should try to find out everything that a programming genius would consider
| > including (and do not rely on what one genius wants; ask everybody) and try
| > to include it _all_.  In addition, the (whatever) should be designed so that
| > any used can expand it easily, because I can not tell you today about the
| > feature which I will consider "obviously" needed tomorrow.
| > 
| > -- 
| > Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
  
> GAK! Have you ever _tried_ designing, and implementing, a language, 
> editor, etc.?

I am not in a position to implement a language.  I have succeeded in
designing an assembler for a particular machine, which actually could
easily be made semiportable.  It is not that dissimilar from CAL, the
Cray assembly language.  The major problem with the languages, editors,
etc., is the fantastic number of conventions.  I doubt that there is any
language which has less conventional notation than any branch of
mathematics.  And the conventions of the languages are usually not in an
"alphabetical" arrangement, so that one can deduce one convention from
the others.  The conventions of editors are worse; a given letter on one
has a vastly different meaning from the same letter on another.

If you look carefully at the part of my posting quoted, you will see that
I do not believe that a few people have the intelligence, knowledge, and
imagination to design a language, editor, etc.  How many of the screen
editors allow one to move vertically beyond the present scope of the line?
How many allow one to tie two lines together (i.e., to allow the motion of
characters in one line to move those in another)?  Why is there no WYSIWYG
editor which produces its output in such a way that it can be translated to
another system?

> The last thing you want to do is include everything
> anybody has ever thought might possibly be a good idea.

Of course, one cannot include everything.  But one can facilitate the 
addition of those things.  Many mathematics papers introduce notation
unknown to the reader.  Some of this even persists.  If a mathematician,
or group of mathematicians, attempted to force the notation of a field,
this effort would be profoundly resisted.  If they suggest a notation,
they may or may not succeed, and it is quite possible that the terminology
will be later modified.


> 
> As for "extensibility", it is much over-rated. Somehow, it always seems
> that the really useful modifications are beyond the capabilities of the
> extension mechanism. 

This means that the extension mechanism is too weak.  Most extension
procedures are overly restrictive, and do not assume that the user wants
to, say, introduce an operation which is not of the type envisaged by the
language designers.

> A universal operating system will be designed when someone very clever,
> imaginative, and artistic creates a reasonably simple model of computer 
> use that encorportates the needs of all users and is economically
> implementable in the technology of the day. If this should ever occur,
> the universal model will be much more likely to encorporate _none_ of
> the ideas of past "programming geniuses" than to encorporate them all.
> 
>     Radford Neal

A moderately universal operating system will be designed when the ideas of
hundreds of clever, imaginative, and artistic people, who know the inadequacies
of their ideas, are combined, probably informally, by people who are likewise
aware of their inadequacies, and are willing to make allowances for it.  If
you are arrogant enough to say that someone should not use a given construct,
you are unable to design even a fair system.  If you think you have all the
answers, you cannot design a fair system.  I have specifically argued against
the idea that there is even a complicated model of computer use that
incorporates the needs of even a majority of the intelligent users, and the
technology of today changes so rapidly that Radford's ideas can only produce
obsolete systems.

-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (ARPA or UUCP) or hrubin@purccvm.bitnet

mdg@smegma.UUCP (Marc de Groot) (05/02/88)

In article <2845@mmintl.UUCP> franka@mmintl.UUCP (Frank Adams) writes:
>The problem with this is that the more things you throw into a language, the
>harder it is to write a compiler for it.  And harder, when it comes to
>software, means it takes longer.

*UNLESS* you are working with an extensible language, in which case adding
features and constructs adds nothing to the implementation of a compiler or
interpreter or what-have-you.
-- 
Marc de Groot (KG6KF)
UUCP: {hplabs, sun, ucbvax}!amdcad!uport!smegma!mdg
AMATEUR PACKET RADIO: KG6KF @ KB6IRS 
"Look, he's mounting a tape!" "Quick, throw cold water on him!"

fpst@hubcap.UUCP (Steve Stevenson) (05/02/88)

 In article <762@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>The language ... should ... try to include it _all_.

You really don't want this WITHOUT a tremendous amount of thought.  PL/I and
ADA are
the obvious counterexamples.

This might be a desirable trait for some semantic/code models so that
a particular model might be easy to implement in a hurry.  But I
think history shows that when programmers are involved, there is
a fairly low complexity level which, if exceeded, makes the language
unpopular and eventually moribund.
-- 
Steve Stevenson                            fpst@hubcap.clemson.edu
(aka D. E. Stevenson),                     fpst@clemson.csnet
Department of Computer Science,            comp.parallel
Clemson University, Clemson, SC 29634-1906 (803)656-5880.mabell

cik@l.cc.purdue.edu (Herman Rubin) (05/02/88)

In article <1543@hubcap.UUCP>, fpst@hubcap.UUCP (Steve Stevenson) writes:
> 
>  In article <762@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
> >The language ... should ... try to include it _all_.
> 
> You really don't want this WITHOUT a tremendous thought.  PL/I and ADA are
> the obvious counterexamples.

The problems with PL/I, which did try to include it all, and ADA are that
they assumed that the bad notations of the predecessors should be continued.
The assembler notation, after which PL/I is largely modeled, is the main
reason that direct assembler code is not used more.  There is no machine
to my knowledge which is as complicated as BASIC, let alone the other HLLs.
ADA did not really try to include it all, and admittedly made no attempt to
provide easily writable code.  The resulting code was therefore not easily
readable.  We can do much better.

We cannot do even a fair job if the job is done by those who have designed the
present stuff.  We need people with imagination, who are aware _that we are
essentially ignorant_, and who are willing and eager to ask for what is wanted.
And they must realize that the resulting system will probably be inadequate,
and therefore must be easily extensible.  This means that the user must be
able to use notation which he considers easy to read, as long as that notation
is not already coopted, and in some cases it might be necessary to change
existing notation.  This is done in mathematics all the time, and does not
lead to problems.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (ARPA or UUCP) or hrubin@purccvm.bitnet

mwm@eris (Mike (My watch has windows) Meyer) (05/03/88)

In article <380@smegma.UUCP> mdg@smegma.UUCP (Marc de Groot) writes:
<In article <2845@mmintl.UUCP> franka@mmintl.UUCP (Frank Adams) writes:
<>The problem with this is that the more things you throw into a language, the
<>harder it is to write a compiler for it.  And harder, when it comes to
<>software, means it takes longer.
<
<*UNLESS* you are working with an extensible language, in which case adding
<features and constructs adds nothing to the implementation of a compiler or
<interpreter or what-have-you.

But that leaves you with different subcommunities using different sets
of extensions, so that code must either carry along all the libraries
it needs to run in the base environment, or be ported to the new
environment (i.e. - the LISP community before CL).

Of course, after that you can standardize on one set of extensions.
Then you get a large, difficult-to-implement language (i.e. - the
LISP community after CL).

	<mike
--
The sun is shining slowly.				Mike Meyer
The birds are flying so low.				mwm@berkeley.edu
Honey, you're my one and only.				ucbvax!mwm
So pay me what you owe me.				mwm@ucbjade.BITNET

mwm@eris (Mike (My watch has windows) Meyer) (05/03/88)

In article <768@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
<We cannot do even a fair job if the job is done by those who have designed the
<present stuff.  We need people with imagination, who are aware _that we are
<essentially ignorant_, and who are willing and eager to ask for what is wanted.

So, instead of just telling us that we're wrong, and that we should be
doing X, or Y and Z, how about _doing_ something about it?

<And they must realize that the resulting system will probably be inadequate,
<and therefore must be easily extensible.  This means that the user must be
<able to use notation which he considers easy to read, as long as that notation
<is not already coopted, and in some cases it might be necessary to change
<existing notation.  This is done in mathematics all the time, and does not
<lead to problems.

You've almost, but not quite, gotten to the nub of the problem. "What
the user considers easy to read." That varies from user to user. Some
people think LISP or FORTH are easy to read - others think that both
are Write-Only-Languages.  Neither has been nearly as successful as C or
FORTRAN.  LISP even allows you to change the reader & printer to match
your favorite input syntax.  None of the alternative systems (CGOL,
etc.) have been very popular in the LISP community.

When you complained about editors having wildly different meanings
for different keys, you almost had it again. Editor wars are usually
considered religious wars, for good reason: what one person thinks is
an absolutely essential, must-have feature, another will think makes
the editor completely unusable (example - the "i" command from vi).
How do you build a standard in an environment where there isn't a
consensus on something as basic as what you do to indicate that you're
inputting new text?

So, do you have any suggestions as to what might be done to get around
this problem, instead of just maintaining that it _is_ a problem (I
don't think so - any more than having both a hammer and a screwdriver
is a problem)? Oh yeah - you might also let us know how many of the
languages outside the FORTRAN/ALGOL group (FP, CSP, Prolog, APL, etc.)
you've looked at.

	<mike

--
Don't tell me how to live my life			Mike Meyer
Don't tell me what to do				mwm@berkeley.edu
Repression is always brought about			ucbvax!mwm
by people with politics and attitudes like you.		mwm@ucbjade.BITNET

fpst@hubcap.UUCP (Steve Stevenson) (05/04/88)

From article <768@l.cc.purdue.edu>, by cik@l.cc.purdue.edu (Herman Rubin):
> 
> ....  This means that the user must be
> able to use notation which he considers easy to read, as long as that
> notation is not already coopted, and in some cases it might be necessary
> to change  existing notation.  This is done in mathematics all the time,
>  and does not lead to problems.

You had a sympathetic ear until this last sentence.  There are in fact
many folks - myself among them - who do exactly what you asked for.
But even within mathematics, there is no complete agreement.  Certainly,
within some subfield of analysis one might find a completely agreed
upon language.

But this is not necessarily true in another subfield.  What is done in
mathematics is that some definitions and local conventions are made.
This causes no problems for the human who is quite capable of semantic
shifting of this sort.  But local idioms are really hard for compilers
to generate code for, because they need special information.  I agree
that this is desirable - but as of this point, we can hardly formally
define what a computation is and what is computable.

I'd hate to see you try to get into defining Lebesgue integration for
your PC. :-)
-- 
Steve Stevenson                            fpst@hubcap.clemson.edu
(aka D. E. Stevenson),                     fpst@clemson.csnet
Department of Computer Science,            comp.parallel
Clemson University, Clemson, SC 29634-1906 (803)656-5880.mabell

djs@actnyc.UUCP (Dave Seward) (05/04/88)

[line killer? what's that!]

In article <768@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>We cannot do even a fair job if the job is done by those who have designed the
>present stuff.  We need people with imagination, who are aware _that we are
>essentially ignorant_, and who are willing and eager to ask for what is wanted.
>And they must realize that the resulting system will probably be inadequate,
>and therefore must be easily extensible.  This means that the user must be
>able to use notation which he considers easy to read, as long as that notation
>is not already coopted, and in some cases it might be necessary to change
>existing notation.  This is done in mathematics all the time, and does not
>lead to problems.


As an (unimaginative) compiler developer I am probably not qualified to comment
on this new language, but it sounds to me like what Mr. Rubin wants largely
exists now in FORTH. Not my preference, tho...

Dave Seward
uunet!actnyc!djs

nevin1@ihlpf.ATT.COM (00704a-Liber) (05/04/88)

In article <438@ruuinf.UUCP> piet@ruuinf.UUCP (Piet van Oostrum) writes:
>In article <1556@vaxb.calgary.UUCP> radford@calgary.UUCP (Radford Neal) writes:

>   A universal operating system will be designed when someone very clever,
>   imaginative, and artistic creates a reasonably simple model of computer 
>   use that encorportates the needs of all users and is economically
>   implementable in the technology of the day. If this should ever occur,
>   the universal model will be much more likely to encorporate _none_ of
>   the ideas of past "programming geniuses" than to encorporate them all.

>How about a Turing Machine?

I don't know.  The infinitely long tape that it needs is usually out of
stock and very expensive!! :-) :-)

Seriously, the problem with a Turing machine is that it is simple with
respect to the person implementing it.  Programming it is a nightmare!
(For those of you who don't think so, show me your Turing machine that
implements the Unix kernel.  Until then, ... :-))  A universal OS (as well
as a universal programming language), assuming that one exists, must be
simple and intuitive to use.  I, as the user, should never have to look at
a manual or go to a help screen.  It should be tailorable to me
automatically (by using heuristics, for instance), and if I am looking at
someone else's work it should, to me, look like my own.

Oh, well.  Enough dreaming. :-)

-- 
 _ __			NEVIN J. LIBER	..!ihnp4!ihlpf!nevin1	(312) 510-6194
' )  )				"The secret compartment of my ring I fill
 /  / _ , __o  ____		 with an Underdog super-energy pill."
/  (_</_\/ <__/ / <_	These are solely MY opinions, not AT&T's, blah blah blah

mmengel@cuuxb.ATT.COM (~XT4103000~Marc Mengel~C25~G25~6184~) (05/04/88)

In article <768@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
<We cannot do even a fair job if the job is done by those who have designed the
<present stuff.  We need people with imagination, who are aware _that we are
<essentially ignorant_, and who are willing and eager to ask for what is wanted.

...
<able to use notation which he considers easy to read, as long as that notation
<is not already coopted, and in some cases it might be necessary to change
<existing notation.  This is done in mathematics all the time, and does not
<lead to problems.

That sounds great, but the real problem with that sort of a solution
comes in the maintenance of a program.  In our imperfect world, most
of the cost of a software product is in maintenance, not initial 
development.  The people who do maintenance work are often not the 
same people who  wrote the product originally.  In this universe
(a.k.a. "The Real World") the real problem with extensible, customizable
languages is that a person who does maintenance programming must be
constantly switching back and forth from one person's notation to another's.
People also have a tendency to assume their notation is "obvious" and
therefore do not document it sufficiently.

When an important factor in the program you write is that other people
have to be able to understand your code quickly, an extensible, customizable
language where the semantics associated with a given syntax are not
constant detracts from the goal of having understandable code.  The
original author may be able to follow it more easily, but any other
poor soul who has to read the code may be hopelessly confused.

The C preprocessor already gives us a glimpse of the horrible things
that one can do with #defines by making things that look like
function calls that aren't.  When you extend that to every nook and
cranny of the language, it gets more and more confusing for people
to read who are not familiar with the "notation" chosen by the original
programmer.
-- 
 Marc Mengel	

 attmail!mmengel
 ...!{moss|lll-crg|mtune|ihnp4}!cuuxb!mmengel

eugene@pioneer.arpa (Eugene N. Miya) (05/04/88)

In article <4624@ihlpf.ATT.COM> nevin1@ihlpf.UUCP (00704a-Liber,N.J.) writes:
>In article <438@ruuinf.UUCP> piet@ruuinf.UUCP (Piet van Oostrum) writes:
>>In article <1556@vaxb.calgary.UUCP> radford@calgary.UUCP (Radford Neal) writes:
>>   A universal operating system will be designed when someone very clever,
>>How about a Turing Machine?
>Seriously, the problem with a Turing machine is that it is simple with
>respect to the person implementing it.  Programming it is a nightmare!
>(For those of you who don't think so, show me your Turing machine that
>implements the Unix kernel.  Until then, ... :-))

I thought months ago about Turing machines. The problem is worse than
programming, worse than the Unix kernel [small and elegant as it is].
The problem is we lay requirements [ooh, bureaucratese] on computers which
are sometimes irrelevant to the conventional concept of computation: how
"user-friendly" is a Turing machine?  Does a Turing machine work in a
"networked" environment?  Anyways, these are not architectural questions, so
a follow up to comp.os.research is appropriate.  Get IT out of arch.

Lawrence Crowl writes:
>In article <25750@clyde.ATT.COM> gwu@clyde.UUCP (George Wu) writes:
>> ... [architecture class] ...  The actual architectures we studied were the
>>PDP-8, PDP-11, VAX, and MC68000. ...
>
>This strikes me as a rather narrow view of architecture.  There are no stack
>machines, no vector machines, no SIMD machines, etc.

Actually, that was the point of the architectural survey.  Not to prove it
exists, but to show specific deficiencies.  I fear there are too few stack,
vector and SIMD machines to go around and not enough interest in building
them since there's more money in micros.

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@ames-aurora.ARPA
	resident cynic			soon to be aurora.arc.nasa.gov
at the Rock of Ages Home for Retired Hackers:
  "Mailers?! HA!", "If my mail does not reach you, please accept my apology."
  {uunet,hplabs,hao,ihnp4,decwrl,allegra,tektronix}!ames!aurora!eugene
  "Send mail, avoid follow-ups.  If enough, I'll summarize."

lisper-bjorn@CS.YALE.EDU (Bjorn Lisper) (05/06/88)

In article <4624@ihlpf.ATT.COM> nevin1@ihlpf.UUCP (00704a-Liber,N.J.) writes:
>In article <438@ruuinf.UUCP> piet@ruuinf.UUCP (Piet van Oostrum) writes:
>>In article <1556@vaxb.calgary.UUCP> radford@calgary.UUCP (Radford Neal)
>>writes:
>
>>>A universal operating system will be designed when someone very clever,
>>>imaginative, and artistic creates a reasonably simple model of computer
>>>use that incorporates the needs of all users and is economically
>>>implementable in the technology of the day. If this should ever occur,
>>>the universal model will be much more likely to incorporate _none_ of
>>>the ideas of past "programming geniuses" than to incorporate them all.
>
>>How about a Turing Machine?
>
>I don't know.  The infinitely long tapes that it needs are usually out of
>stock and very expensive!! :-) :-)

No, no, no, the tape is finite at every time but it must be possible to
extend it to an arbitrary (but finite) length! So every physical Turing
Machine should be placed next to a tape factory that can provide it with
more tape when the need arises.... :-) :-) :-)

Bjorn Lisper

mouse@mcgill-vision.UUCP (der Mouse) (05/10/88)

[> and >>> = Herman Rubin, cik@l.cc.purdue.edu]
[>> = Radford Neal, radford@calgary.UUCP]
>>> ... The language, shell, window, editor, etc. developers should try
>>> to find out everything that [anyone wants] and try to include it
>>> _all_.

This is a nice ideal.  However, it has problems.

- Just finding out what everyone wants will take forever.  I don't just
  mean a long time, I mean forever.  By the time you've finished asking
  everyone, the people you've already asked will have changed their
  minds, and new people with new opinions and desires will be here.

- The result will be unimplementable.  It will be so large that not
  only is it impossible for one person to understand the whole thing,
  but the number of people necessary to do it will be so large that the
  project will collapse under its own weight.  Read Fred Brooks' book,
  and imagine a system two orders of magnitude larger than what he's
  describing.

- Once implemented, even assuming it ever gets done, the result will be
  too large to run on enough machines to make it useful, and too slow
  to be useful even on the machines it will fit on.  (Perhaps not;
  hardware capacity and speed are exploding fast enough that by the
  time your project is done, it might be able to run.)

>> GAK!  Have you ever _tried_ designing, and implementing, a language,
>> editor, etc.?
> I am not in a position to implement a language.

Yet you try to tell the language designers and implementors their job?
I, too, believe current languages are imperfect.  I, however, realize
that there are probably reasons why this is true.  I have done some
preliminary thinking on designing and building a new language, and the
more thought I put into it, the more I realize why things are the way
they are.

> The major problem with the languages, editors, etc., is the fantastic
> number of conventions.  I doubt that there is any language which has
> less conventional notation than any branch of mathematics.

This is unclear.  "...has less conventional notation than...": does
this mean "...has notation which is less conventional than that of..."
or does it mean "...has less of conventional notation than..."?

Either way, so what?  I fail to see why you keep trying to draw
analogies with mathematics.

> The conventions of editors are worse; a given letter on one has a
> vastly different meaning from the same letter on another.

To draw an analogy of my own with mathematics, how many different,
incompatible, meanings does the `+' symbol have?

> Of course, one cannot include everything.  But one can facilitate the
> addition of those things.  Many mathematics papers introduce notation
> unknown to the reader.

This happens when programming, too: it's called defining a function (or
declaring a procedure, or whatever the idiom is for the language in
question).

> Some of this even persists.

This corresponds to what are called libraries.

>> As for "extensibility", it is much over-rated.  Somehow, it always
>> seems that the really useful modifications are beyond the
>> capabilities of the extension mechanism.
> This means that the extension mechanism is too weak.

Unfortunately, we don't know how to design a really general extension
mechanism, unless you count giving out the source code as providing an
extension mechanism.

> Most extension procedures are overly restrictive, and do not assume
> that the user wants to, say, introduce an operation which is not of
> the type envisaged by the language designers.

Most programs of any size introduce operations not envisaged by the
language designers.  They are called "functions".  For example, the
editor I'm using to type this, a derivative of Gosling Emacs, has an
internal operation which sets something called a marker.  The designers
of C, the language it's written in, did not explicitly provide for this
operation, but they provided a mechanism for defining this operation,
and it works.

>> A universal operating system will be designed when [...read the
>> original if you want the details].  If this should ever occur, the
>> universal model will be much more likely to incorporate _none_ of
>> the ideas of past "programming geniuses" than to incorporate them
>> all.

I agree with this, that the next truly great operating system,
language, editor, whatever, is much likelier to incorporate none of the
ideas of the past than to incorporate them all.

> I have specifically argued against the idea that there is even a
> complicated model of computer use that incorporates the needs of even
> a majority of the intelligent users, and the technology of today
> changes so rapidly that Radford's ideas can only produce obsolete
> systems.

Until the next genius comes along, we have to do what we know how to
do.  From what I've seen of your ideas, I don't expect them to produce
any sort of system, while if we keep doing what we know how to do, we
will keep getting useful (though imperfect) systems out.  I may be
wrong; someone may build a system based on your ideas.  But I'll have
to see it to be convinced.

					der Mouse

			uucp: mouse@mcgill-vision.uucp
			arpa: mouse@larry.mcrcim.mcgill.edu

mouse@mcgill-vision.UUCP (der Mouse) (05/10/88)

In article <768@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
> In article <1543@hubcap.UUCP>, fpst@hubcap.UUCP (Steve Stevenson) writes:
>>  In article <762@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>>> The language ... should ... try to include it _all_.
>> You really don't want this WITHOUT a tremendous thought.  PL/I and
>> ADA are the obvious counterexamples.
> The problems with PL/I, [...more about what the problems are...]
> We can do much better.

I am not convinced we can.  Show me.

> This means that the user must be able to use notation which he
> considers easy to read, [...].

The problem is that the next person won't consider it easy to read, or
at least not as easy to read.  You rapidly wind up with a maintenance
nightmare.  See the Obfuscated C Code Contest for examples of what can
be done with a language that isn't even particularly well suited to
introducing new notation!  (Can you *imagine* what might come out of an
Obfuscated C++ Code Contest?!)

> This is done in mathematics all the time, and does not lead to
> problems.

Mathematical papers are much, much, smaller than computer programs.
The source to the editor I'm using right now, for example, is between
19 and 20 thousand lines.  This would be approximately three hundred
pages of paper.  How many math papers are that long?  And this isn't
even a really large program.  I just did a quick scan of the kernel
source directory, and I would estimate that it's close to 150 thousand
lines, or two to two-and-a-half thousand pages.

For programs on the same order of magnitude of size as the typical math
paper, I would expect problems no more serious than arise in
mathematics.

					der Mouse

			uucp: mouse@mcgill-vision.uucp
			arpa: mouse@larry.mcrcim.mcgill.edu

mouse@mcgill-vision.UUCP (der Mouse) (05/10/88)

In article <4624@ihlpf.ATT.COM>, nevin1@ihlpf.ATT.COM (00704a-Liber) writes:
> A universal OS (as well as a universal programming language),
> assuming that one exists, must be simple and intuitive to use.  I, as
> the user, should never have to look at a manual or go to a help
> screen.

Unfortunately, what is simple and intuitive to one person isn't to
another.

> [...] if I am looking at someone else's work it should, to me, look
> like my own.

I don't expect this any sooner than I expect Turing-capable AI
programs.  Style is too many things, including things too subtle to
easily change.  What you are asking for, in essence, is something that
looks at (say) a program, deduces what it does (as distinct from how it
does it), and re-does the same thing the way you would have done it.
Among other things, this implies that it's at least as intelligent as
you are.  Now this may be possible in some cases (seeing some of the
software coming over the net), but it surely is not possible for all
people.

					der Mouse

			uucp: mouse@mcgill-vision.uucp
			arpa: mouse@larry.mcrcim.mcgill.edu

nevin1@ihlpf.ATT.COM (00704a-Liber) (05/12/88)

[followups to comp.misc; I can't figure out where else to send it]

In article <1090@mcgill-vision.UUCP> mouse@mcgill-vision.UUCP (der Mouse) writes:
>In article <4624@ihlpf.ATT.COM>, nevin1@ihlpf.ATT.COM (00704a-Liber) writes:
>> A universal OS (as well as a universal programming language),
>> assuming that one exists, must be simple and intuitive to use.  I, as
>> the user, should never have to look at a manual or go to a help
>> screen.

>Unfortunately, what is simple and intuitive to one person isn't to
>another.

True, but we are moving in that direction.  For instance: most people find
a Mac-like windowing interface simple and intuitive (whether they like it
is a different story; they can figure out how to use it).

>> [...] if I am looking at someone else's work it should, to me, look
>> like my own.

>I don't expect this any sooner than I expect Turing-capable AI
>programs.

If they aren't going to be Turing-capable, then what kind of AI programs
do you expect?

>Style is too many things, including things too subtle to
>easily change.  What you are asking for, in essence, is something that
>looks at (say) a program, deduces what it does (as distinct from how it
>does it), and re-does the same thing the way you would have done it.

This is what I am expecting in the far future (3 years :-)).  Right now,
though, is it asking too much to have an editor indent a program the way I
use indentation, or suggest that I define one keystroke for something that
I do a lot that requires many keystrokes (for instance :g/^>/s//| in vi)?
With current software technology, we have programmable editors.  I want
that to go one step further; I shouldn't have to program it; with some
heuristics, AI, and the ability to study my keystrokes, an editor should be
able to make life a little easier for me.
-- 
 _ __			NEVIN J. LIBER	..!ihnp4!ihlpf!nevin1	(312) 510-6194
' )  )				"The secret compartment of my ring I fill
 /  / _ , __o  ____		 with an Underdog super-energy pill."
/  (_</_\/ <__/ / <_	These are solely MY opinions, not AT&T's, blah blah blah