[comp.sys.amiga.tech] Why do you need metaphor?

schow@bnr-public.uucp (Stanley Chow) (08/11/89)

In article <440@xdos.UUCP> doug@xdos.UUCP (Doug Merritt) writes:
> [discussion of how ps works on Unix.]                      In almost
>all other ways, Unix very consistently treats everything like a file.
					^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
It is not clear (at least to me) that this is a good thing. Why do you
want everything to look like a file? A file has a very limiting structure.
Why would you want to force a complicated data structure into a file? I think
appropriate use of linked data structures is entirely correct.
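
Stanley's point can be made concrete with a minimal C sketch (the names
below are mine, not from the thread): to push even a trivial linked
structure through a file, a program must flatten it to a byte stream on
the way out and rebuild every link on the way in.

```c
#include <stdio.h>
#include <stdlib.h>

/* A trivial linked structure: a singly linked list of ints. */
struct node {
    int value;
    struct node *next;
};

/* Flatten the list into a file: pointers cannot be stored, so only
   the values go out, in order. */
static void save_list(const struct node *head, FILE *fp)
{
    for (; head != NULL; head = head->next)
        fwrite(&head->value, sizeof head->value, 1, fp);
}

/* Rebuild the list from the flat byte stream, re-creating every
   link the file format had to discard. */
static struct node *load_list(FILE *fp)
{
    struct node *head = NULL, **tail = &head;
    int v;

    while (fread(&v, sizeof v, 1, fp) == 1) {
        struct node *n = malloc(sizeof *n);
        n->value = v;
        n->next = NULL;
        *tail = n;
        tail = &n->next;
    }
    return head;
}
```

A plain list flattens cheaply; a structure with shared or cyclic links
forces the program to invent its own encoding of the pointers, which is
exactly the work the file metaphor pushes onto each application.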

>
>And even with ps, Unix is nominally within the boundaries of that
>consistency, for it reads a symbol table from a file (/unix or /vmunix)
>to find the offsets in files of what it wants (/dev/mem and /dev/kmem).
>There is no poking directly around in memory.  I agree that this is a
>grey area and that it is inelegant on an absolute scale, but it is
>relatively elegant compared with the AmigaDOS method of directly accessing
>device/task data structures in your address space without any benefit
>of any consistency metaphor.
		    ^^^^^^^^

This is perhaps the key. I look on metaphor as an aid to learning something.
Once I know the subject, *it* becomes the metaphor for learning something
else. Why is it good to tie everything forever to a bad metaphor (files)?


>
>Both Unix and AmigaDOS would benefit from extensions that put the
>list of tasks (apparently) into a directory where they could be listed
>(and perhaps deleted, etc) using standard utilities. Unlike AmigaDOS, this
>ability *is* about to appear in a mainstream Unix: V5.4.

Again, my question: why should deleting a task be the same as deleting an 
entry from a directory? The code gets more complicated since the delete 
command now must know to call the delete-task routine. The user still has to
know that a task is involved so there is no hope of 'undo'.

Perhaps what you want is a 'consistent' interface, where command names and
syntax for different commands follow a consistent convention. This is
very different from having the 'same' command do it in the same way.

>  [...]                     But the lesson from history (of Unix) that
>is most important, yet most widely disregarded, is that it *is* consistent,
>and the user *can* do all that stuff from the command line, and that
>all those utilities *do* behave well e.g. with piped input/output.

Not meaning to start another Unix flame war, but I think you mean that *you*
have found *a set* of utilities that fit well together with *your* style of
computer usage. The default set(s) of utilities from the various vendors
are by no means 'consistent'. Including all the freebies and bought S/W in
any description of 'consistent' is at best inconsistent.

>
>Unix has, compared with everything but the Smalltalk kernel, the greatest
>degree of consistency of application of its underlying metaphor, and that
>*inherently* makes it more powerful *overall* than any system with a
>lesser consistency. Don't even bother to criticize it until you've got
>a system that beats it in this respect. If you can set up message passing
>ports between tasks purely via the CLI in AmigaDOS, and if all standard
>utilities support this, then you've got something that might start making
>Unix look like a toy. But meanwhile, pretending that e.g. AmigaDos is
>*already* far better than Unix simply prevents learning from history.
>

I know almost nothing about any Unix-type kernel, so I cannot comment on the
kernel internals. I do, however, have strong views about the consistency
as seen by the *user*. In my view, the Mac is more consistent than the
Amiga, and the Amiga is more consistent than Unix.



Stanley Chow        BitNet:  schow@BNR.CA
BNR		    UUCP:    ..!psuvax1!BNR.CA.bitnet!schow
(613) 763-2831		     ..!utgpu!bnr-vpa!bnr-fos!schow%bnr-public
Me? Represent other people? Don't make them laugh so hard.

rtczegledi@crocus.waterloo.edu (Richard Czegledi) (08/11/89)

Something that I feel would be very useful (not to mention nice) is to
give the Amiga the ability to put 'files' under 'files'.  Waterloo Port
does something like that.

It could function just like a directory, but could contain a text file.

If something like this were implemented, it could really clean up our
directories.  No more 'read.me', 'read.me1st', 'dist', etc. files.  Some of
them could act like the directory, and have the files they refer to
'underneath' them.  Also, .info files wouldn't clutter up the directories
anymore.  .info files could be placed under their executables, and when you
wish to copy the file, you just do a normal copy, and the .info file will
go with it.

All the little files that are support files for the executable could also
be contained underneath the executable.  This would mean that installing
programs and copying certain things to certain directories would essentially
become pointless, and from a user standpoint, the system would be simpler
to understand.  If you did a dir, you could have a display like this:

+ise JRCOMM  +ie Preferences  (or something like that)

That would mean that the file JRCOMM is an executable, has support files
under it, and has an info file for Workbench.  Preferences is an executable,
has an info file for Workbench, but doesn't require support files.

It would make the Amiga much simpler to maintain, and give it prettier,
uncluttered directories.  And if WB1.4 decides to let the user list files
by text (instead of just icons), you could have a little icon beside the
filename, indicating what, if anything, is underneath it.

possibilities possibilities possibilities!

xanthian@well.UUCP (Kent Paul Dolan) (08/19/89)

[Lots of stuff about how the Unix "file" metaphor isn't so great after all
omitted in the interests of retaining sanity.]

Computer science hasn't stopped progressing just because Unix(tm) was
invented.  The plaudits for the Unix "file" metaphor are not meant to
denigrate later developments, but to compare it to the systems existing
when Unix was created.

If you go back and look at the '60s operating systems (OS/360, GCOS, etc.),
you find that the OS was responsible for knowing the structure of the
bytes in a file.  The result was the JCL nightmares of fixed, variable,
variable-spanned, and so on record structures, each of which was built into
the OS,
and all of which had to be describable by the system user in JCL.

The glory of Unix is the realization that the OS has no business knowing the
internal structure of a file; a file is just a stream of bytes.  It is the
responsibility of the program accessing those bytes to interpret some
structure into them.

This paradigm, mapped to devices, allowed a console to be treated as a file
too, since all that ever goes to or from it is also a stream of bytes.
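
That device mapping is easy to demonstrate; in the C sketch below (a
hedged illustration, not from the thread), the copy loop neither knows
nor cares whether its descriptors refer to a disk file, a pipe, or a
terminal.

```c
#include <unistd.h>

/* Copy a stream of bytes from one descriptor to another.  Nothing
   here depends on what the descriptors actually are: a disk file,
   a pipe, and a terminal all look the same to read() and write(). */
static long copy_bytes(int in_fd, int out_fd)
{
    char buf[4096];
    long total = 0;
    ssize_t n;

    while ((n = read(in_fd, buf, sizeof buf)) > 0) {
        if (write(out_fd, buf, (size_t)n) != n)
            return -1;      /* short write: report failure */
        total += n;
    }
    return n < 0 ? -1 : total;
}
```

Run with descriptors 0 and 1 it behaves like cat; pointed at two open
files it behaves like cp; the program never tells the OS which case it is.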

Most subsequent progress has been built on the base furnished by Unix of
making the OS (relatively) smaller, simpler and cleaner, since it only
has to deal with one file type, the stream of bytes.

Of course, within its own files, the OS knows the structure, but it is
amazing how much of other OSes' internal code became separate, non-OS
programs under Unix.

Somewhere in there, the idea of bundling the process and its data became
the concept of an abstract data type so dear to the object oriented
programmers of today.

Not too scruffy from a beginning of "keep it simple, stupid." ;-)

well!xanthian
Kent, the man from xanth, now just another echo from The Well.

eachus@mbunix.mitre.org (Robert Eachus) (08/22/89)

In article <13207@well.UUCP> xanthian@well.UUCP (Kent Paul Dolan) writes:

>If you go back and look at the '60s operating systems (OS/360, GCOS, etc.),
>you find that the OS was responsible for knowing the structure of the
>bytes in a file.  The result was the JCL nightmares of fixed, variable,
>variable-spanned, and so on record structures, each of which was built into
>the OS,
>and all of which had to be describable by the system user in JCL.

>The glory of Unix is the realization that the OS has no business knowing the
>internal structure of a file; a file is just a stream of bytes.  It is the
>responsibility of the program accessing those bytes to interpret some
>structure into them....

     If the idea of "unstructured" files came originally from Unix, or
if the reason that Unix originally treated files the way it does had
anything to do with your reasons, or if the files in Unix were "just a
stream of bytes", I probably wouldn't have commented.  But...THAT
ISN'T THE WAY IT HAPPENED, THAT ISN'T WHY IT HAPPENED, and finally
THAT ISN'T WHAT FILES ARE IN UNIX!  Please don't rewrite history,
especially where it is important like here!

     Unix was written to be a small single user system by people
who had been part of the Multics effort before Bell Labs pulled out.
One of the key principles of Multics was that there was no distinction
between disk (or originally drum) files and main memory (back then
core).  Dynamic linking, segment tables, snapping of links, packed
pointers and unpacked pointers, and a hierarchical memory structure
were all tools developed to make that user abstraction work.  (There
were other principles, and other innovations to support them, involved
in the design of Multics, but that is another story.)

     Multics users (and at that time all such users were developers)
quickly became used to the programming paradigm that the way to change
a file was to operate on it in random access fashion, in memory.  The
fact that Multics used demand paged virtual memory to make that
efficient in all cases was hidden.  (Maybe that wasn't entirely
another story. :-) 
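
That Multics paradigm, changing a file by operating on it random-access
in memory, eventually reappeared in Unix as mmap(2); here is a minimal
POSIX sketch (function and file names are mine, invented for
illustration) of what the abstraction looks like to a program:

```c
#include <fcntl.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Change one byte of a file by treating the file as memory, in the
   Multics style: map it, store through the pointer, unmap. */
static int poke_file(const char *path, off_t offset, char byte)
{
    int fd = open(path, O_RDWR);
    struct stat st;
    char *mem;

    if (fd < 0)
        return -1;
    if (fstat(fd, &st) < 0 || offset >= st.st_size) {
        close(fd);
        return -1;
    }
    mem = mmap(NULL, (size_t)st.st_size, PROT_READ | PROT_WRITE,
               MAP_SHARED, fd, 0);
    if (mem == MAP_FAILED) {
        close(fd);
        return -1;
    }
    mem[offset] = byte;             /* an ordinary store edits the file */
    munmap(mem, (size_t)st.st_size);
    close(fd);
    return 0;
}
```

The store through mem[offset] is an ordinary memory write; the
demand-paging machinery turns it into file I/O behind the program's
back, which is exactly the hiding described above.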

    The developers of Unix wanted to work in the same fashion, and
having internal structure to files was an unnecessary nuisance.  So
the file system was designed so that the internal structure of files
was hidden from the users.  (But files had structure, and programmers
needed to understand it, unlike on Multics.)  The "new" file system
had different structure, and it is what Unix systems commonly use
today, but boy was it a porting nightmare.

     Note that Multics files have structure normally (very well) hidden
from the user, Unix files also have a hidden (though originally not so
well hidden) structure, and AmigaDOS learned the lesson well.  (So
well, in fact, that I can't remember ANY tools, programs, or files
which wouldn't work with the 1.0 file system, the Old File System,
and the Fast File System; the incorrect reporting of file sizes in
earlier releases of info and list was, to my mind, a bug.)  There is
good documentation for how all of those file systems work, and if you
really believe that those file systems are unstructured, go read it!

>Somewhere in there, the idea of bundling the process and its data became
>the concept of an abstract data type so dear to the object oriented
>programmers of today.

     Huh?  There are lots of definitions of abstract data types
floating around, so someone, somewhere, must have proposed one like
yours, but you missed the mark again.  The idea of an abstract data
type is to combine the definition of the type with the operations on
the type, so that the details are hidden from the user.  That is a
good fit with modern file systems.

>Not too scruffy from a beginning of "keep it simple, stupid." ;-)

     This is a real insult; however, you are probably, based on the
above comments, unaware of just how insulting it is.  There are
SHELVES of papers, almost all of which make interesting reading today,
on the design details you dismiss so glibly.  Making these things have
a simple appearing paradigm, and what that paradigm should be, were a
decade of labor for hundreds of people who thought that there was a
better model of computer human interaction than was embodied in the
early 360's.  (Note that I probably should have said IBM 704, but the
IBM 704 was the ancestor of the 7094, which was the predecessor of the
GE 635...  The announcement of the 360 series was the step that
caused MIT to turn to GE for the processor to run Multics, but I digress.)

     Minicomputers, and CRT terminals, and interactive computing, and
the box that you have on your desk were the product of a lot of
visionary people who saw how good it could be, and spent years in the
wilderness making it happen.  Some eventually got rich, but a lot of
people who could have deserted the cause, stuck through good times and
bad, to the idea that batch processing on mainframes was NOT the way
to use computers.  The first computer I programmed was an IBM 650. I
still have a copy of the first assembler manual at home (GP for the
Univac I).  I played Spacewar on the PDP-1 at MIT in 1964.  I worked
on debugging the first time sharing systems.  WHERE WERE YOU?

					Robert I. Eachus

with STANDARD_DISCLAIMER;
use  STANDARD_DISCLAIMER;
function MESSAGE (TEXT: in CLEVER_IDEAS) return BETTER_IDEAS is...

      If there are people who want to reminisce about the old days,
I'd love to, but let's keep it to e-mail.  I only posted this because I
don't want young programmers kept in the dark.  Santayana got it
right, and I remember seeing a column (in Datamation I think, eight or
ten years ago when it was still worth reading) about visiting a
microcomputer show in the Z80 days, and recognizing a lot of the old
mistakes being repeated.

xanthian@well.UUCP (Kent Paul Dolan) (08/26/89)

In article <64502@linus.UUCP> eachus@mbunix.mitre.org (Robert I. Eachus) writes:
[Screenfuls of completely missing the point omitted.]

Sure: you, I, and everyone else in the civilized universe know that there
_IS_ a complex structure of inodes, block accessing, and whatever hidden
down under the apparently simple interface to Unix files.  The point,
which escaped you to the point of near apoplexy, is that applications
programmers from the student beginner can completely ignore that complex
structure and treat the file as a stream of (randomly accessible) bytes.

All the blurb about Multics is also wide of the mark.  I was responding
to a series of articles all claiming in essence "Unix isn't so great,
look at this (long-after-Unix-developed) much nicer way to choose a
metaphor of device access"; and I attempted to provide a bit of historical
context to the discussion.  This does not require me to trace the concepts
back to Babbage, and frankly, except for being the predecessor that drove
the Unix developers away to do things better, Multics was a dinosaur, however
nice the ideas incorporated.  Don't believe me?  Check its market share
versus Unix or even the OS-360's children. ;-)

I even have my very own History of Programming Languages; I'm not at all
unaware of the history; in fact, in some small parts, I was a part of it.

>>Not too scruffy from a beginning of "keep it simple, stupid." ;-)
>
>     This is a real insult; however, you are probably, based on the
>above comments, unaware of just how insulting it is.

a) That funny looking thing is a smiley face.  Welcome to the net, BIFF!
It is exactly the ideas that make a complicated subject simple that
improve the world.  The Unix file as a stream of bytes paradigm made a
horrendously complicated process in every other file system much easier
to comprehend.  And no, that god-awful segmented mess that was the Multics
idea of a file doesn't begin to compare.

b) The insult is in your own mind, not the text you read.  When you
take the time to read what is in front of you, perhaps your written
responses will contain a bit more light and less heat.

>The first computer I programmed was an IBM 650. I
>still have a copy of the first assembler manual at home (GP for the
>Univac I).  I played Spacewar on the PDP-1 at MIT in 1964.  I worked
>on debugging the first time sharing systems.  WHERE WERE YOU?

Looks like by the time you got started, youngster, I was already on
my fourth or fifth programming language.

>					Robert I. Eachus
>

Who has certainly had better days than that one!

>
>      If there are people who want to reminisce about the old days,
>I'd love to, but let's keep it to e-mail.  I only posted this because I
>don't want young programmers kept in the dark.  Santayana got it
>right, and I remember seeing a column (in Datamation I think, eight or
>ten years ago when it was still worth reading) about visiting a
>microcomputer show in the Z80 days, and recognizing a lot of the old
>mistakes being repeated.

We have the same goals; I wonder why your approach fails to meet the
need.  Think, then type, and you'll surely do better.

well!xanthian
Kent, the man from xanth, now just another echo from The Well.