[comp.society.futures] Where are the small thinkers?

oster@dewey.soe.berkeley.edu (David Phillip Oster) (10/25/87)

This is a reply to an article in comp.society.futures. I am
crossposting to comp.sys.mac, because it has some interesting things
to say about why the Mac is the way it is.

Take a look at the Canon Cat. Jef Raskin, the original person at
Apple working on the Macintosh (before Steve Jobs replaced him with
himself), started a company called "Information Appliances". He has
been working for a number of years on his vision of what an
information appliance is.

The company's first product was a ROM board that fit into an Apple II,
and made it behave similarly to their second product, the Canon Cat.

Imagine a Macintosh with only 256k of memory, no mouse, no graphics,
no file system, no operating system, and only one font, which is not
even proportionally spaced. The keyboard is permanently attached, and
the micro-floppy is beside the display instead of below it.

On the Macintosh, a floppy is ejected under software control so a
naive user cannot remove it when it is in an invalid state. The Cat
has the old fashioned and error-prone direct push button.

No file system: it stores a single document on each floppy. (Documents
can have sections, but the user is responsible for maintaining the
table of contents.)

No operating system: it comes up in one program, a simple word
processor. The word processor has one level of undo (but you can't
undo typing, so it isn't clear which operations are undoable).

The word processor has a certain amount of extra functionality over,
say, the original WordStar. It has a dictionary and a "Spell" key. Select
an arithmetic expression, and hit the calculate key, and it will
fill in the result after an "=" sign. Select a phone number and hit
"dial" and it apparently adds the entire telecommunications session to
the current document. (So much for using a remote screen editor.)
Spell, calculate, and dial are done by holding down a "blue" special
shift key, then pressing a letter that has that command written on its side.
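The calculate key described above is easy to picture in code. Here is a
minimal sketch of that behavior (my own reconstruction in Python for
illustration; the real machine's software was written in Forth, and the
function names here are invented):

```python
# Sketch of the Cat's "calculate" key: evaluate the selected arithmetic
# expression and fill in the result after an "=" sign.
# (Hypothetical reconstruction, not the Cat's actual code.)
import ast
import operator

# Only plain arithmetic is allowed: no names, no function calls.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.USub: operator.neg,
}

def _eval(node):
    if isinstance(node, ast.Expression):
        return _eval(node.body)
    if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
        return node.value
    if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
        return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
    if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
        return _OPS[type(node.op)](_eval(node.operand))
    raise ValueError("not a plain arithmetic expression")

def calculate_key(selection: str) -> str:
    """Return the selected text with '= result' filled in after it."""
    result = _eval(ast.parse(selection, mode="eval"))
    return f"{selection} = {result:g}"
```

So selecting "2 + 3 * 4" and hitting the key would turn it into
"2 + 3 * 4 = 14" in place.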

No mouse: not even any arrow keys. Motion and selection are both done
by incremental search: you hold down a special search forward or
search back shift key while you type. These special shift keys, called
"leap keys" are located below the space bar. Let go of the leap key
and you are out of search mode. Press it again, and you search again.
(I think having these is a good idea. Getting rid of the mouse and
arrow keys, and every other difference from the Mac I've described
here is a bad idea.)
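For concreteness, leap-style motion might look like this, assuming the
cursor simply jumps to the nearest occurrence of whatever has been typed
so far (a sketch reconstructed from the description above, not the Cat's
actual behavior in every detail):

```python
def leap(text: str, cursor: int, pattern: str, forward: bool = True) -> int:
    """Incremental-search cursor motion in the style of the Cat's leap
    keys: jump to the nearest occurrence of `pattern`, searching forward
    or backward from the current position. Stays put on no match."""
    if forward:
        hit = text.find(pattern, cursor + 1)
    else:
        hit = text.rfind(pattern, 0, cursor)
    return cursor if hit == -1 else hit
```

Each keystroke while the leap key is held would re-run this with one
more character of pattern; releasing the key leaves the cursor where the
search put it.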

The price is between $1000 and $1500, I think.

Conclusion: the machine is a small, simple solution to a problem that
is becoming more and more obsolete. It is a great
computer-as-super-personal-typewriter, in a world where the typewriter
is going the way of the dinosaur, being replaced by the personal
typesetter+drafting table. If this is Jef Raskin's vision of the
people's computer, I can only say: thank God for Steve Jobs.

Catty comment: I'm told the system software is all written in Forth.
Forth programmers always do a quick job that solves the problem, only
the solution is (1) only 80% there, and (2) usually the wrong problem.
(I know, I used to be a Forth programmer, and I did things like write
a complete music editor in one day.)

This whole thing was written in response to an article in
comp.society.futures on "Whatever happened to Small is Beautiful?",
lamenting that the window systems and text processors of today are
bigger and more powerful than the ones of a few years ago. One answer
is that today's systems don't solve the same problems. A second answer
is: want to buy my 48k RAM CP/M system? It can run old WordStar, and
I'll sell it to you for only $300, which is 1/10 of what I bought it
for 5 years ago. It works fine, and still does everything it ever did;
I just don't want that anymore.


--- David Phillip Oster            --I'm not an actor, but I play one
Arpa: oster@dewey.soe.berkeley.edu --on TV.
Uucp: {uwvax,decvax,ihnp4}!ucbvax!oster%dewey.soe.berkeley.edu

glg@sfsup.UUCP (G.Gleason) (10/27/87)

In article <21430@ucbvax.BERKELEY.EDU> oster@dewey.soe.berkeley.edu.UUCP (David Phillip Oster) writes:
	[ stuff deleted about the "Canon Cat", and how it probably
	  takes the KISS principle much too far ]
>This whole thing was written in response to an article in
>comp.society.futures on "Whatever happened to Small is Beautiful?",
>lamenting that the window systems and text processors of today are
>bigger and more powerful than the ones of a few years ago. One answer
>is that today's systems don't solve the same problems. A second answer
>is: want to buy my 48k RAM CP/M system? It can run old WordStar, and
>I'll sell it to you for only $300, which is 1/10 of what I bought it
>for 5 years ago. It works fine, and still does everything it ever did;
>I just don't want that anymore.

I think you have gotten to the heart of the problem.  Small *is*
beautiful, but I think a lot of people have a distorted idea of what
small is.  First of all, economy of concepts is probably much more
important than saving bytes in core or on disk.  We are still on the
curve of reduction in size, cost, power, etc. of the hardware, and
software engineering is still in its infancy.  The programs and systems
may get much more complex, but they will do many more functions without
overloading us with new details to learn (the manuals get smaller).

Nobody knows better than the computer professionals who must use them
every day that many contemporary systems are a pain to use.  I am
constantly frustrated by having to use shell commands with zillions of
cryptic options, but on the other hand I often find that MS-DOS just
doesn't have the power to do what I need.  The future is the power
(and more) of UNIX, and the ease of use of Mac-type software.
Needless to say, the programs to accomplish this will be more complex,
but they will also be more bug-free and easier to use.  Today, to
install and use a UNIX system, you need an administrator who is fluent
in the use of UNIX.  This is too much to ask of the typical
installation.  In the future this function needs to be taken over by
an "expert system", that is, a piece of software that does what the
systems administrator does today (obviously this will not make the
software smaller and less complex).

Gerry Gleason

bzs@BU-CS.BU.EDU (Barry Shein) (10/29/87)

There is another version of "small" that is more tied to a software's
design than its mere size. It has to do with fighting the permutations.

I have a friend who says that anything with more than 4 simultaneous
options is doomed. He reasons that that means a dozen or more viable
combinations, and the human mind can usually only juggle several on a
good day.

Thus we see the negativity towards a Unix command like 'ls', where I
can type 'ls -lutc' and lord knows what will come out. At another
extreme is an editor like emacs, which has thousands of options and
commands but seems to work well enough because I rarely have to
consider their interaction unless I'm hacking it. The major command
interaction is prefixing a command with a count. Some may disagree
with my examples, but I'm sure you can find your own.
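My friend's arithmetic is easy to make concrete: if options combine
freely, every non-empty subset of them is a distinct behavior to
understand, so four options already give more than a dozen. A quick
sketch (illustrative only):

```python
from itertools import combinations

def option_combinations(options):
    """Every non-empty combination of a set of flags, each of which is
    a distinct behavior a user of the command may have to understand."""
    subsets = []
    for r in range(1, len(options) + 1):
        subsets.extend(combinations(options, r))
    return subsets

# With just the four flags in 'ls -lutc' there are 2**4 - 1 = 15
# combinations to reason about; six flags would give 63.
flags = ["l", "u", "t", "c"]
combos = option_combinations(flags)
```

The count doubles with every flag added, which is why "more than 4
simultaneous options" is where the juggling breaks down.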

I think to some extent this dimension is as important as any measure
of "small". Perhaps "simple" is beautiful. I believe Dennis Ritchie
remarked in his BSTJ article (my memory might fail me here; it's not
handy) that the limits of the PDP-11 systems they had to work on
forced smallness, and that this smallness in turn forced simplicity
and an elegant power in their design.

I don't think smallness for smallness' sake is the point; it's clarity
of design (which is sometimes born of constraints, but neither limited
to them nor ensured by them: RT-11's syscall interface was surely
small and constrained, but it was still complex. Like I said, it only
takes about four interacting things to cause problems).

	-Barry Shein, Boston University

dan@WILMA.BBN.COM (10/30/87)

I agree; smallness is good when it refers to smallness of the design,
not smallness of the code.  It's really a matter of the complexity of
the mental model you have to carry around when you want to think about
the whole system.  Most systems grow more complex when people work on
them, because they enhance by adding a new feature.  But it's usually
better to go over the assumptions that went into the design and try to
add or change one assumption.  Unfortunately this requires a global
view of the system that most maintainers don't have, as well as a
strong commitment to the ideal of simplicity; it's HARD to enhance a
system this way!

If you can't add a capability merely by rethinking a design
assumption, you could try instead to keep the total number of features
constant: if you are adding a new feature, take an old one out.
During a Berkeley Steering Committee discussion a few years back about
adding symbolic links to UNIX, Dennis Ritchie argued that if symbolic
links were going to be added, then hard links (that is, the ability to
have multiple hard links to the same file) should be removed.  There
should only be one way to do something: if you're adding a new way
because the old way wasn't good enough, then the new way should
replace the old one.  I didn't agree at the time (and neither, I
think, did anyone else) because of the compatibility problems it would
cause.  Now, I'm not so sure.  Keeping the old feature around makes
the system harder to think about.  Supersets should supersede!
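The redundancy Ritchie objected to is easy to demonstrate: a hard link
is just another directory entry for the same inode, while a symbolic
link is a file that holds a path, so the two diverge the moment the
original name is removed. A minimal POSIX sketch (using Python's os
module purely for illustration, obviously not anything from the UNIX of
the time):

```python
import os
import tempfile

# Two ways to give one file a second name: a hard link and a symlink.
with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "target")
    with open(target, "w") as f:
        f.write("hello")

    hard = os.path.join(d, "hard")
    soft = os.path.join(d, "soft")
    os.link(target, hard)      # hard link: another name for the same inode
    os.symlink(target, soft)   # symbolic link: a file containing a path

    # Both names reach the same contents...
    assert open(hard).read() == open(soft).read() == "hello"

    # ...but remove the original and they diverge: the hard link still
    # holds the data, while the symlink dangles.
    os.remove(target)
    assert open(hard).read() == "hello"
    assert not os.path.exists(soft)   # exists() follows the dangling link
```

Two mechanisms, one purpose, different corner cases: exactly the kind
of feature pair that makes the system harder to think about.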

So what happened to the small philosophy?  It's just temporarily
suspended.  Originally simplicity in programming was a virtue because
there just wasn't much room to do things in.  Now that's not a problem
anymore, but we've found a new reason for simplicity: it makes a
system easier to understand, use, and maintain.

But it's hard to design systems that way, even though it's worth it.
It doesn't come naturally to most computer programmers, even good ones
(even ones who've enhanced UNIX in significant ways).  In the future I
think we'll see more emphasis on design skills in computer science.
Planning a large computer program requires as much design skill as
planning a large building, but while architects take classes in
design, computer programmers usually take no classes in design at all.
In the future I think that will change.  Already there are computer
classes (at Berkeley, I think) in which students write a program in a
team, then give the program to another team to modify.  They are
scored both on the original program and how easy it was to change.

I would also like to believe that object-oriented languages encourage
better design, because they can make it easier to think about the
design assumptions and the whole design process.  But I don't know if
that's really true.  Perhaps someone who has modified someone else's
large Smalltalk program can comment...

	Dan Franklin (dan@bbn.com)