[comp.sys.next] replacing the desktop metaphor

bonar@pitt.UUCP (Dr. Jeffrey Bonar) (12/19/88)

I have an invitation for net readers - create a metaphor for computing 
systems that goes beyond the desktop cliche.  Four years ago, Apple 
had something with the Macintosh desktop: a new way to think about 
computing.  Now, everyone is copying the desktop: Microsoft, IBM, 
AT&T.  Even the new NeXT machine provides little more than a 
desktop with some cute simulated depth.

Marshall McLuhan said that a new medium always begins by 
imitating the old medium: cow paths were paved to make roads for 
the "horseless carriage", film began by putting a camera in front of a 
play, and now computer screens look like a desktop.  What if 
we really let go and embraced the new medium: what should a computer 
work space really look like?

William Gibson described a cyberspace where computer cowboys 
shared a:
 
"consensual hallucination experienced daily by billions of 
legitimate operators, in every nation, by children being taught 
mathematical concepts ... A graphic representation of data 
abstracted from the banks of every computer in the human 
system.  Unthinkable complexity.  Lines of light ranged in the 
nonspace of the mind, clusters and constellations of data.  Like 
city lights, receding ..."  (pg 51, Ace paperback edition of 
Neuromancer)

What does your cyberspace, or whatever you would call it, look like?  
I'm interested in suggestions that are practical and serious, in 
particular, suggestions constrained by current technology in screens, 
keyboards, mice, etc.  I'm also interested in suggestions that are 
fanciful and poetic. 
 
We get to create a medium from scratch - what should it look like?
Note: please mail your suggestions to me directly.  I will post a 
collection of the results.

Send suggestions to: 

	Internet: bonar@vax.cs.pittsburgh.edu

or, using normal mail:

	Jeffrey Bonar
	708 LRDC
	University of Pittsburgh
	Pittsburgh, PA  15260

cory@gloom.UUCP (Cory Kempf) (12/21/88)

Sorry that this is so long.

In article <4362@pitt.UUCP> bonar@pitt.UUCP (Dr. Jeffrey Bonar) writes:
>
>I have an invitation for net readers - create a metaphor for computing 
>systems that goes beyond the desktop cliche.
[...]
>						what should a computer work 
>space really look like?
>
>William Gibson described a cyberspace...

(I ignored the request to e-mail on the subject, 'cause I think that this
needs a wider discussion)

Up until about four years ago, all interaction with a computer was through
a single linear path (well, two actually).  Characters were typed on
the keyboard, and characters appeared on the screen.  UNIX especially
is designed around this concept of a serial communication line.  Its
networking utilities (rsh, rlogin, rcp, etc.) are the utilities that are 
implied by this.  

About a year ago, there was an idea for networking the Mac (actually, I think
it was for a multiuser Mac) that included the concept of a primitive
cyberspace.  It was based on the desktop metaphor of the Mac.  

The idea was to extend the desktop of the Mac the way it is done with 
multiple monitors on a Mac II.  Give each user their own mouse/keyboard.
If a user wanted, he could walk around the extended desktop with the mouse
the same way that is done with Close View.  Also, the user could pass a window
to someone else's desktop so that they could work on the application as well.

Each user could of course customize their own desktop much the way that
is done now.  

What I would like to see is the desktop metaphor extended into 3D, say
for example, an office.  You would have a desktop, a trashcan, a phone,
an inbasket/outbasket, a filesystem, etc.  Each of the services that are 
offered by the system are represented as an object in the office.  If you 
go out through the door, you find yourself in the hall (network), and from 
there can go into someone else's office (the outbasket & phone act in a 
predictable manner).
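
To make the idea concrete, here is a rough sketch in C of how such an
office might be represented.  All the names are made up for illustration;
this isn't any real system.  Each service is just an object placed in the
room, and the door is an object that leads to another room - the hall,
i.e. the network.

/* Illustrative sketch of an "office" metaphor as an object graph. */
#include <stdio.h>

struct room;                        /* forward declaration             */

struct object {
    const char *name;               /* e.g. "trashcan", "phone"         */
    const char *service;            /* system service it stands for     */
    struct room *leads_to;          /* non-NULL only for doors          */
};

struct room {
    const char *name;
    struct object objects[8];
    int nobjects;
};

int main(void)
{
    struct room hall   = { "hall (the network)", {{ 0 }}, 0 };
    struct room office = { "my office", {
        { "desktop",      "working surface",  NULL  },
        { "trashcan",     "delete files",     NULL  },
        { "phone",        "mail / talk",      NULL  },
        { "file cabinet", "filesystem",       NULL  },
        { "door",         "leave the office", &hall } }, 5 };

    /* "Walking around" the office is just enumerating its objects;
       activating the door object would move you into the hall.        */
    for (int i = 0; i < office.nobjects; i++) {
        struct object *o = &office.objects[i];
        printf("%-12s -> %s\n", o->name, o->service);
        if (o->leads_to)
            printf("                (opens into: %s)\n", o->leads_to->name);
    }
    return 0;
}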

It'll be expensive (in terms of cpu time/bandwidth) but I think that it will
be worth it in the long run.  The way that you interact with the computer
in part determines the ways that you will consider using it.  (ex: desktop
publishing out of the Mac)

Comments?

-- 
Cory (...your bravest dreams, your worst nightmare...) Kempf
UUCP: encore.com!gloom!cory
	"...it's a mistake in the making."	-KT

kwc@naucse.UUCP (Ken Collier) (12/22/88)

In article <257@gloom.UUCP>, cory@gloom.UUCP (Cory Kempf) writes:
> In article <4362@pitt.UUCP> bonar@pitt.UUCP (Dr. Jeffrey Bonar) writes:
> >
> >I have an invitation for net readers - create a metaphor for computing 
> >systems that goes beyond the desktop cliche.
> [...]
> >						what should a computer work 
> >space really look like?
> >
> >William Gibson described a cyberspace...
> 
> What I would like to see is the desktop metaphor extended into 3D, say
> for example, an office.  You would have a desktop, a trashcan, a phone,
> an inbasket/outbasket, a filesystem, etc.  Each of the services that are 
> offered by the system are represented as an object in the office.  If you 
> go out through the door, you find yourself in the hall (network), and from 
> there can go into someone else's office (the outbasket & phone act in a 
> predictable manner).

To take your idea a bit further how about getting rid of the CRT display
and creating a holographic display which could be projected anywhere 
in your office (or workspace) that is convenient.  When your computer is
not in use (if that is possible) none of your desk space would be taken up.

You would even be able
to determine the metaphor that gets projected.  Say, Cory's idea of the 
office with the doors and hallways, etc.  Or possibly a "zoomed" view of
the desktop with all of its accoutrements.  Manipulation of this projection
would be the user's choice of a standard keyboard or some sort of pointing
device.  Perhaps we could even come up with a way that the user could 
physically manipulate the holographic characters.  

Remember the chess game in one of the Star Wars movies, where the holographic
images battled one another until one was "killed"?  If Lucasfilm can
do it, it must be possible! :-)

Ken Collier
College of Engineering and Technology
Northern Arizona University
Flagstaff, Arizona

landman%hanami@Sun.COM (Howard A. Landman) (12/22/88)

In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
>What I would like to see is the desktop metaphor extended into 3D, say
>for example, an office.  You would have a desktop, a trashcan, a phone,
>an inbasket/outbasket, a filesystem, etc.  Each of the services that are 
>offered by the system are represented as an object in the office.  If you 
>go out through the door, you find yourself in the hall (network), and from 
>there can go into someone else's office (the outbasket & phone act in a 
>predictable manner).

Yes, but if I'm alternately working in my office and someone else's far away,
I want to be able to switch back and forth quickly.  How about teleport
booths instead of hallways?

(Sorry this isn't followed up to alt.cyberpunk, but Sun's news server
won't allow posting to alt groups.)

	Howard A. Landman
	landman@hanami.sun.com

remy@cit-vax.Caltech.Edu (Remy Sanouillet) (12/22/88)

In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
[Previous article omitted.]
>
>About a year ago, there was an idea for networking the mac (actually, I think
>it was for a multiuser mac) that included the concept of a primative cyber-
>space.  It was based on the desktop metaphor of the mac.  
>
>The idea was to extend the desktop of the mac they way it is done with 
>multiple moniters on a mac II.  Give each user their own mouse/keyboard.
>If a user wanted, he could walk around the extended desktop with the mouse
>the same way that is done with Close View.  Also, the user could pass a window
>to someone else's desktop so that they could work on the application as well.
>
>Each user could of course customize their own desktop much the way that
>is done now.  
>
>What I would like to see is the desktop metaphor extended into 3D, say
>for example, an office.  You would have a desktop, a trashcan, a phone,
>an inbasket/outbasket, a filesystem, etc.  Each of the services that are 
>offered by the system are represented as an object in the office.  If you 
>go out through the door, you find yourself in the hall (network), and from 
>there can go into someone else's office (the outbasket & phone act in a 
>predictable manner).
>
>It'll be expensive (in terms of cpu time/bandwidth) but I think that it will
>be worth it in the long run.  The way that you interact with the computer
>in part determines the ways that you will consider using it.  (ex: desktop
>publishing out of the Mac)
>
>Comments?
>
>-- 
>Cory (...your bravest dreams, your worst nightmare...) Kempf
>UUCP: encore.com!gloom!cory
>	"...it's a mistake in the making."	-KT

This is basically the subject of my PhD dissertation. It extends
Fred Thompson's "New World of Computing (tm)" natural language system
to a host of networked users. Each user works in a "context", basically
his environment with his customized slang based on his native language
(currently the system understands English, French and Italian, but other
tongues are in the works).

But the user can open up his context to the rest of the world using
several different methods. One is called "basing" and involves
incorporating another context (i.e. Dow Jones, Sears catalog) by
creating virtual links to it.

My role is allowing users to share their contexts, which contain
data base objects in several different media forms (entities,
texts, pictures, sound recordings, etc.), by opening up a common
window where each user retains his/her means of control. They each
have a cursor, mouse pointer or whatever pointing device their
computer supports, and a voice link is opened for direct communication.
This allows, for example, a team of designers scattered all over the
world to all lean over the same blueprint, give advice, make changes, 
query the data base to find who is affected by the change, get them
in on the meeting and send the revised project to manufacturing.

If all goes well, my prototype should be working in a few months.
The way we see it, this is going to be the next mutation of the
telephone and computer into one standard device in every home.
Seeing how the previous mutation (the Minitel in France) has
generated such a thrill in the general French user community,
there is little doubt that we are heading for some exciting days.

-------------------------------------------------------------------------------
Remy Sanouillet                      |      E-mail: remy@caltech.BITNET
256-80 Caltech                       |              remy@csvax.caltech.edu
Pasadena, CA 91125                   |              ...seismo!cit-vax!remy
Tel. (818) 356-6262                  |                              

mlandau@bbn.com (Matt Landau) (12/22/88)

In comp.windows.misc, landman@sun.UUCP (Howard A. Landman) writes:
>In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
>>What I would like to see is the desktop metaphor extended into 3D, say
>>for example, an office.  You would have a desktop, a trashcan, a phone. . .
>
>Yes, but if I'm alternately working in my office and someone else's far away,
>I want to be able to switch back and forth quickly.  How about teleport
>booths instead of hallways?

This is beginning to sound in some ways like a network-distributed version
of Rooms, the new "desktop thing" from ParcPlace.

If you haven't seen/heard of Rooms, you should look into it.  It presents 
you with multiple workspaces, called rooms, each of which contains the set 
of tools commonly used for doing some job (i.e., a collection of windows 
and programs).  There are doors for going from room to room, bags in which 
you can carry things from one room into another, and pockets in which you
can put things that you want to follow you around (like a clock, or your
phone) as you move from room to room.
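
For anyone who hasn't seen it, the structure is easy to picture in code.
The C sketch below is only a guess at the general shape of the thing, not
the PARC/Envos implementation: a room is a set of windows, doors link
rooms, and pockets are windows overlaid on whatever room you happen to be
in.

/* Rooms-like workspace model, in outline only; the design details here
   are guessed for illustration. */
#include <stdio.h>

#define MAXWIN  8
#define MAXDOOR 4

struct room {
    const char *name;
    const char *windows[MAXWIN];    /* tools that live in this room  */
    int nwindows;
    struct room *doors[MAXDOOR];    /* rooms you can walk to         */
    int ndoors;
};

/* Pockets: windows that follow you into whatever room you enter. */
static const char *pockets[] = { "clock", "phone" };

static void enter(const struct room *r)
{
    printf("== %s ==\n", r->name);
    for (int i = 0; i < r->nwindows; i++)
        printf("  window: %s\n", r->windows[i]);
    for (int i = 0; i < (int)(sizeof pockets / sizeof pockets[0]); i++)
        printf("  pocket: %s\n", pockets[i]);
}

int main(void)
{
    struct room mail = { "mail room", { "mail reader" }, 1, { NULL }, 0 };
    struct room edit = { "editing room", { "editor", "compiler" }, 2,
                         { &mail }, 1 };
    mail.doors[0] = &edit;          /* make the door two-way           */
    mail.ndoors   = 1;

    enter(&edit);                   /* start in the editing room       */
    enter(edit.doors[0]);           /* walk through a door into mail   */
    return 0;
}
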
--
 Matt Landau			Waiting for a flash of enlightenment
 mlandau@bbn.com			  in all this blood and thunder

osmigo@ut-emx.UUCP (12/22/88)

[replacing the desktop metaphor, for heaven's sake]

I've often wondered if wireless technology would ever hit computers. Many
top rock and roll bands, for example, have a little 3" antenna sticking out of
their electric guitars instead of a clumsy cable. Then there are wireless
telephones, not to mention all kinds of "remote controls" that work via
infrared pulses. So how about a wireless keyboard and/or mouse?

To go the route:

Flat-screen monitor 2 inches thick, 3 feet square. Hang it on the wall. The
CPU and magnetic/optical storage device (!) are stuck under a desk or something.
The keyboard has a trackball on one end of it, and interfaces with a
small infrared sensor in one corner of the monitor. Want to use the computer?
Take the keyboard out of your desk drawer, hit the "on" button, and compute
away. When you're finished, hit "off" and chuck the keyboard back into the
desk drawer, your briefcase, or whatever. 

A monitor as described is a ways down the road, for sure, but I don't know
why a wireless keyboard would be far-fetched. It'd only have to send a
hundred or so distinct infrared (or other) codes: one for each ASCII character.
There are video/audio remotes on the market right now that do more than that.
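
To make that concrete, here is a toy sketch in C of what sending a
keypress might look like.  The framing (a start pulse, eight data bits,
a stop pulse) is invented for the example; it is not the PCjr's or any
other keyboard's actual protocol.

/* Toy illustration: a keypress is only a handful of bits. */
#include <stdio.h>

static void send_key(unsigned char c)
{
    printf("'%c': S ", c);              /* S = start pulse              */
    for (int bit = 0; bit < 8; bit++)   /* least-significant bit first  */
        putchar((c >> bit) & 1 ? '1' : '0');
    printf(" E\n");                     /* E = end/stop pulse           */
}

int main(void)
{
    const char *msg = "hi";
    for (int i = 0; msg[i]; i++)
        send_key((unsigned char)msg[i]);
    return 0;
}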

Ron

=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
>  Ron Morgan      {ames, utah-cs, uunet, gatech}!cs.utexas.edu!ut-emx!osmigo  <
>  Univ. of Texas    {harvard, pyramid, sequent}!cs.utexas.edu!ut-emx!osmigo   <
>  Austin, Texas          osmigo@ut-emx.UUCP       osmigo@emx.utexas.edu       <
=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
 

jbn@glacier.STANFORD.EDU (John B. Nagle) (12/22/88)

       One can get carried away with mapping information into an illusion of
physical reality.  I've seen a number of examples of this in research
systems.  The architect of Microsoft Windows once was involved with the
development of a word processor in which one deleted words by dragging
them to the "trash can".  This proved not to be a useful concept.  At
Xerox PARC, I've seen a system in which windows and objects on the screen
have gravity, inertia, friction, and resiliency.  Move a window with the
mouse, release it while it's still moving, and it continues to move until
it hits something, then bounces.  Cute, but not useful.  The VPL people
tout the notion of programming by wiring functional units together with
"wires" on-screen.  This may appeal to the fraction of the population
that enjoys wire-wrapping.  A version of this concept for the little
ones is already available, called Robot Odyssey.  This game, from The
Learning Company, allows kids to wire up simple robots with sensors,
"reaction jets", bumpers, and such, interconnected with logic built up
from AND and OR gates, flip-flops and inverters.  One can even create
new ICs; enough real estate is available in an IC for about a dozen
components.  An amusing game, but a painful way to get work done.

      More later if the dialog gets serious.
      					
      					John Nagle

rlr@utastro.UUCP (Randy Ricklefs) (12/22/88)

> [references to other article deleted]
> Flat-screen monitor 2 inches thick, 3 feet square. Hang it on the wall. The
> CPU and magnetic/optical storage device (!) are stuck under a desk or something.
What about replacing the entire top of a desk with some type of flat display?
Then add a touch-screen mechanism to drag papers, forms, etc. across the desk
top to work on them.  (Then your electronic desktop could become as cluttered
as the physical one. :-) )

> A monitor as described is a ways down the road, for sure, but I don't know
> why a wireless keyboard would be far-fetched.

I believe IBM used an IR-based detached keyboard on their much-maligned
PC jr.  I haven't seen anyone take up that gauntlet since then.
> 
> Ron
> 
  Randy
-- 

                       Randy Ricklefs
       uucp:  {ut-sally, ut-ngp, noao, charm}!utastro!rlr
  arpa:  rlr@astro.AS.UTEXAS.EDU       phone:  (512) 471-1342

timd@cognos.uucp (Tim Dudley) (12/22/88)

In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
>Sorry that this is so long.
>
>In article <4362@pitt.UUCP> bonar@pitt.UUCP (Dr. Jeffrey Bonar) writes:
>>
>>I have an invitation for net readers - create a metaphor for computing 
>>systems that goes beyond the desktop cliche.
>[...]
>
>What I would like to see is the desktop metaphor extended into 3D, say
>for example, an office.  You would have a desktop, a trashcan, a phone,
>an inbasket/outbasket, a filesystem, etc.  Each of the services that are 
>offered by the system are represented as an object in the office.  If you 
>go out through the door, you find yourself in the hall (network), and from 
>there can go into someone else's office (the outbasket & phone act in a 
>predictable manner).
>
etc...

This looks to me like an application of the Rooms metaphor proposed by Card
and Henderson out of Xerox PARC (and more recently Europarc).  As I remember,
the Rooms metaphor presented a means of linking desktops through "windows" and 
"doors", in such a way that if you wanted to look at another application, or
view of an application, you did it through a "window", but if you wanted to
launch an application, you did it by going through a "door" into the "room"
in which the application was resident.  Seems to me that the idea of having
one of the "doors" lead into a hall (network) is a good one.

The Rooms metaphor has been published in several places, including ACM
Transactions on Graphics (don't remember which one, but it's relatively
recent).  It strikes me as being more closely related to hypermedia than to
3D...


-- 
Tim Dudley                           Cognos Incorporated 
(613) 738-1440                       3755 Riverside Drive, P.O.Box 9707 
uucp: uunet!mitel!sce!cognos!timd    Ottawa, Ontario, Canada  K1G 3Z4
 "It's a damn poor mind that can think of only one way to spell a word." 

roberts@cognos.uucp (Robert Stanley) (12/22/88)

In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:

>In article <4362@pitt.UUCP> bonar@pitt.UUCP (Dr. Jeffrey Bonar) writes:
>>
>>I have an invitation for net readers - create a metaphor for computing 
>>systems that goes beyond the desktop cliche.
>[...]
>>what should a computer work space really look like?

>What I would like to see is the desktop metaphor extended into 3D, say
>for example, an office.  You would have a desktop, a trashcan, a phone,
>an inbasket/outbasket, a filesystem, etc.  Each of the services that are 
>offered by the system are represented as an object in the office.  If you 
>go out through the door, you find yourself in the hall (network), and from 
>there can go into someone else's office (the outbasket & phone act in a 
>predictable manner).

The HyperCard Navigator II stack is an interesting experimental example of
structuring a familiar metaphor for the work space.  It presents a full
office, complete with a desk which has both a desktop and a set of drawers
for files and for miscellaneous bits and pieces.  It is clearly a richer
metaphor than the simple desktop, even though this example is full of
cutsie-pie off-the-cuff symbology which turns out not to be completely
capable of generalization.  (Some of the symbols work, some don't, and most are
specific rather than generic.  Of course, with HyperCard you can roll your
own anyway, so perhaps this is a specious argument).  However, in keeping
with the concept of HyperCard as a personal data-access manager, there is
no provision in the Navigator II example to wander out of your office.  It
ought to be real easy to do, however (just another card, right?), and tied
into AppleTalk (sorry, LocalTalk) there are some nice possibilities.

The big, big problem with any of this stuff is that for environment A to
report to its user what is going on in environment B, environment B must
expend some resource telling environment A what is going on.  What is more,
this has to be done when environment A wants it, not when it is "convenient"
for environment B.  In practical terms, it must be possible for any active
environment to continuously support a background task with the sole purpose
of supplying local information to remote requestors.  Not impossible, but it
raises the spectre of security, and it's not the nicest kind of app to try
and write on the Mac, even under MultiFinder.

On the subject of how such a metaphor might actually work, there are
currently a couple of interesting lines of work at Xerox (remember them?
Created that Star thingy...): they are devoting considerable attention to
the whole field of computer support for co-operative working, and on the
Interlisp machines they have a really interesting meta-windowing system
known as Rooms.  Each full-screen display is simply a room, and a meta-
navigational system allows you to open and close doors between rooms, and
to move from room to room.  I am too lazy to open my filing cabinet and
look right now, but if anyone cares I can dig out formal references.
There's lots of stuff on rooms, which you can buy if you have the right
kind of hardware, and the co-operative working was shown as one of the
video sessions at the CHI conference in Washington, DC earlier this year.

Robert_S
-- 
Robert Stanley - Cognos Incorporated: 3755 Riverside Drive, P.O. Box 9707, 
Compuserve: 76174,3024		      Ottawa, Ontario  K1G 3Z4, CANADA
uucp: uunet!mitel!sce!cognos!roberts             Voice: (613)738-1338 x6115
arpa/internet: roberts%cognos.uucp@uunet.uu.net    FAX: (613)738-0002

maxsmith@athena.mit.edu (Samuel M Druker) (12/23/88)

Believe it or not, Commodore used to have a cartridge for their 64K machine
that worked along that exact idea.  It had an office with a phone, trash can,
filing cabinet, typewriter, clock, calculator, etc.   My dad (expert in 
computer illiteracy) loved it.


==============================================================================
Samuel Druker			|	ARPA: maxsmith@athena.mit.edu
Zortech, Inc.           	|	Bix: maxsmith
Arlington, MA           	|	"Basically, my boss doesn't even
    				|	  *know* I'm on the net."

gckaplan@soup.ssl.berkeley.edu (George Kaplan) (12/23/88)

In article <8939@ut-emx.UUCP> osmigo@emx.UUCP (Ron Morgan) writes:
[suggestions for wireless keyboards, displays]
>
>                                                     ...  but I don't know
>why a wireless keyboard would be far-fetched. It'd only have to send a
>hundred or so infrared (or other) pulses: one for each ASCII character.
>There are video/audio remotes on the market right now that do more than that.
>

A wireless keyboard has already been implemented on at least one
computer:  the IBM PCjr.  Its keyboard used infrared to communicate
with the cpu box.  It worked pretty well through a wide range of
angles, although as I recall you had to have a clear line of sight 
between the keyboard and the cpu.  There were interference problems 
sometimes if you tried to put two cpu's next to each other.

The PCjr was a flop, but most likely for reasons other than the fact
that the keyboard was wireless.

George C. Kaplan		Internet:  gckaplan@sag4.ssl.berkeley.edu
Space Sciences Lab		UUCP:  ...!ucbvax!sag4.ssl!gckaplan
University of California	
Berkeley, CA  94720

casseres@Apple.COM (David Casseres) (12/23/88)

In article <17917@glacier.STANFORD.EDU> jbn@glacier.UUCP (John B. Nagle) writes:
>
>       One can get carried away with mapping information into an illusion of
>physical reality.

Yes indeed.  The most striking thing about the "desktop" in the Lisa/Mac
user interface is how little it resembles an actual desk top: just enough
to provide familiarity and some intuitive operations.

Metaphor is a very limited tool in user-interface design, and has to be
used with great restraint.

David Casseres

welch@tut.cis.ohio-state.edu (Arun Welch) (12/23/88)

In article <12417@garnet.BBN.COM>, mlandau@bbn.com (Matt Landau) writes:
> In comp.windows.misc, landman@sun.UUCP (Howard A. Landman) writes:
> >In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
> >>What I would like to see is the desktop metaphor extended into 3D, say
> >>for example, an office.  You would have a desktop, a trashcan, a phone. . .
> >
> >Yes, but if I'm alternately working in my office and someone else's far away,
> >I want to be able to switch back and forth quickly.  How about teleport
> >booths instead of hallways?
> 
> This is beginning to sound in some ways like a network-distributed version
> of Rooms, the new "desktop thing" from ParcPlace.
> 
> If you haven't seen/heard of Rooms, you should look into it.  It presents 
> you with multiple workspaces, called rooms, each of which contains the set 
> of tools commonly used for doing some job (i.e., a collection of windows 
> and programs).  There are doors for going from room to room, bags in which 
> you can carry things from one room into another, and pockets in which you
> can put things that you want to follow you around (like a clock, or your
> phone) as you move from room to room.
> --

I haven't seen the ParcPlace implementation, but the Envos
implementation for their Lisp environment (which is actually the
original) is pretty spiff.  The article describing it can be found in CHI
86, by Austin Henderson and Stuart Card.  Interestingly enough, I'm in
my "Telnet" room as I write this...  The Envos implementation extends
the metaphor by allowing you to include any room in any other, not
just the Pockets.  This allows you to do things like have a
Control-Panel room, which you include in every room, and use your
Pockets for just carrying things between rooms.  Henderson and Card
implemented it originally in Interlisp, and for efficiency reasons
Envos re-implemented it (once upon a time, you could go get a cup of
coffee while it switched rooms, if you were fast and your rooms were
busy), enhancing it considerably along the way.  I believe it's still
an ongoing research project at PARC, however.  Incidentally, it makes a
pretty useful slide-show system too, for demos.  You just have each
room be a different slide, describing how your system has changed as
it goes along....:-).  I've been using the new Rooms for about 6
months now, and really love it. 

...arun

shiffman%basselope@Sun.COM (Hank Shiffman) (12/23/88)

In article <12417@garnet.BBN.COM> mlandau@bbn.com (Matt Landau) writes:
>This is beginning to sound in some ways like a network-distributed version
>of Rooms, the new "desktop thing" from ParcPlace.

Just a small correction.  Rooms comes from Envos, not ParcPlace.  An
easy mistake to make, since they were both spun off from Xerox.

-- 
Hank Shiffman                                     (415) 336-4658
AI Product Marketing Technical Specialist       ...!sun!shiffman
Sun Microsystems, Inc.                          shiffman@Sun.com

Zippy sez:
  JAPAN is a WONDERFUL planet -- I wonder if we'll ever reach
 their level of COMPARATIVE SHOPPING...

tim@mcrware.UUCP (Tim Harris) (12/23/88)

There actually has been one system I have seen with an infra-red keyboard: it
is a Fujitsu machine available only in Japan called the FM-77A/V.  The A/V 
stands for Audio/Video, as the machine has both excellent graphics and full
stereo sound.  It has the computer in a very small main unit which contains
two 3" floppies and an infra-red detector.  The keyboard is completely wireless
and uses infra-red.  The mouse, however, is still wired to the machine.  

There is a wireless tracking device scheduled for use on the new Philips CD-I
player systems that consists of a 10 key keypad, a thumb-ball tracking device 
and several trigger buttons for the tracking device.  This is supposed to 
be infra-red.  I've seen the pictures but not the device itself.

Tim

davef@Jessica.stanford.edu (David Finkelstein) (12/23/88)

In article <8939@ut-emx.UUCP> osmigo@emx.UUCP (Ron Morgan) writes:

>I've often wondered if wireless technology would ever hit computers. Many
>top rock and roll bands, for example have a little 3" antenna sticking out of
>their electric guitars instead of a clumsy cable. Then there are wireless
>telephones, not to mention all kinds of "remote controls" that work via
>infrared pulses. So how about a wireless keyboard and/or mouse?
>

Remember the IBM PC Jr?  It had a wireless keyboard.  Used infra-red
signals, I believe.  A friend of mine (who now works at Apple) wanted
to build a little box that would just send out the sequence for
Control-Alt-Delete, and walk into computer stores.

One neat thing that IBM has is "personal terminals."  (Well I thought
it was neat.)  The person who repairs our IBMs has one.  Basically
it's a small computer about 8"x4"x2", with a two or three line lcd
display.  It's cordless and antennaless, and connects via a cellular
network to IBM central.  From the field, they can send and receive
mail and messages, or ask about the availability of parts, or get more
detailed technical information.  Cellular remote computing...

**************************************************************
David Finkelstein		|davef@jessica.stanford.edu
Academic Information Resources	|
Stanford University		|     Just say "please."

casseres@Apple.COM (David Casseres) (12/23/88)

In article <3494@utastro.UUCP> rlr@utastro.UUCP (Randy Ricklefs) writes:

>What about replacing the entire top of a desk with some type of flat display?
>Then add a touch-screen mechanism to drag papers, forms, etc. across the desk
>top to work on them.  (Then your electronic desktop could become as cluttered
>as the physical one. :-) )

I think it was Johan Strandberg who said years ago that the "right" size
for the display is not "full-page" (uh, should it be legal size or metric?),
nor "two-page," but the size of an opened-up newspaper, since that is the
size that evolved as suitable for displaying multiple "windows" of
information.  Once you have that, he argued, you should build it into the top
of your desk -- not because it's supposed to represent a desktop, but
basically for ergonomic reasons and because once you have reached this
Nirvana, you won't need the desktop space for anything else!

David Casseres

edwardm@hpcuhc.HP.COM (Edward McClanahan) (12/23/88)

osmigo@ut-emx.UUCP writes:

> I've often wondered if wireless technology would ever hit computers.
...
> So how about a wireless keyboard and/or mouse?

IBM had such a keyboard - in the original PC Jr.
...I heard they had a problem when a classroom full of PC Jrs. were
   in use.  You see, each keyboard would broadcast in such a way that
   several PC Jrs. would receive it.  I suppose this has some benefits...

> Flat-screen monitor 2 inches thick, 3 feet square. Hang it on the wall.

Again, IBM has this device.  I think it sells for approx. $5000, though.

> The keyboard has a trackball on one end of it,...

I've even seen a Mac keyboard that has the trackball built in.

ed "discarding a better mouse trap" mcclanahan

gore@eecs.nwu.edu (Jacob Gore) (12/23/88)

/ comp.sys.next / gckaplan@soup.ssl.berkeley.edu (George Kaplan) / Dec 22 '88 /
>A wireless keyboard has already been implemented on [...]
>the IBM PCjr.  Its keyboard used infrared to communicate
>with the cpu box.  It worked pretty well through a wide range of
>angles, although as I recall you had to have a clear line of sight 
>between the keyboard and the cpu.

And that line-of-sight restriction was a big limitation.  A friend of mine
used one when they were still "hot" (he worked for IBM).  His box & monitor
were on a shelf on his desk, and he had to tilt the keyboard up in order to
have it pointed at the box...

>The PCjr was a flop, but most likely for reasons other than the fact
>that the keyboard was wireless.

The wireless keyboard was a totally useless feature.  If you have to be
close to the monitor to read it, close to the box to work the floppy drives
(there was no hard disk, if I recall correctly), and... I don't remember if
the monitor and CPU were in the same box, but they were certainly attached
to each other.  So, you can't take the wireless keyboard far from the rest
of the machine, and you have to point it at the machine too.  Silly.

Jacob Gore				Gore@EECS.NWU.Edu
Northwestern Univ., EECS Dept.		{oddjob,gargoyle,att}!nucsrl!gore

hassell@tramp.Colorado.EDU (Christopher Hassell) (12/23/88)

In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
#
#What I would like to see is the desktop metaphor extended into 3D, say
#-- 
#Cory (...your bravest dreams, your worst nightmare...) Kempf
#UUCP: encore.com!gloom!cory
#	"...it's a mistake in the making."	-KT

I have heard about a VERY interesting though likely to fail new method of 
3-d displays.  It basically is like a crt except that a mirror *vibrates*
at 60hz (probably audible) back and forth, producing a "scanned" block
of apparent display space left on the retina.  It would be cheap, but the
moving-part aspect will probably kill it.

Any other Cheap <read Practical> ideas [Until Holograms can be dynamically 
   projected]?
### C.H. ###

jr@bbn.com (John Robinson) (12/23/88)

In article <5486@boulder.Colorado.EDU>, hassell@tramp (Christopher Hassell) writes:
>I have heard about a VERY interesting though likely to fail new method of 
>3-d displays.  It basically is like a crt except that a mirror *vibrates*
>at 60hz (probably audible) back and forth, producing a "scanned" block
>of apparent display space left on the retina.  I would be cheap, but the
>moving part aspect will probably kill it.

Not at all.  You have described the BBN Spacegraph.  It is not new at
all; maybe almost 10 years old by now.  It is a PC (as in IBM) board
plus a reasonable-performance oscilloscope with very low persistence
phosphors.  A rigid plexi mirror is energized by a good ol' woofer at
60 hz, causing it to change its focus and move the reflected image of
the oscilloscope face through an apparent depth of a few inches.  The
60hz is essentially inaudible (it may be 30 hz; you can paint both
comin' and goin' for an effective 60hz refresh).  The mirrors hold up
just fine.
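
The arithmetic is simple enough to sketch: if the apparent image plane
sweeps through its depth range (roughly) sinusoidally, you plot each point
at the instant in the cycle when the plane passes through that point's
depth.  Here is a back-of-the-envelope sketch in C; the 30 Hz rate and the
4-inch depth range are assumptions for illustration, not the Spacegraph's
actual parameters.

/* When to plot a point of depth z on a vibrating-mirror display.
   Assumes the image plane moves as d(t) = D/2 * (1 - cos(2*pi*f*t));
   the numbers are illustrative, not BBN's. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define MIRROR_HZ 30.0          /* full in-and-out sweeps per second */
#define DEPTH_IN   4.0          /* apparent depth range, in inches   */

/* Seconds into the sweep at which the image plane is at depth z. */
static double plot_time(double z)
{
    double phase = acos(1.0 - 2.0 * z / DEPTH_IN);   /* 0 .. pi */
    return phase / (2.0 * M_PI * MIRROR_HZ);
}

int main(void)
{
    for (double z = 0.0; z <= DEPTH_IN; z += 1.0)
        printf("z = %.0f in  ->  draw at t = %.2f ms into the sweep\n",
               z, 1000.0 * plot_time(z));
    return 0;
}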

I can provide a contact if anyone cares.
--
/jr
jr@bbn.com or bbn!jr

jr@bbn.com (John Robinson) (12/23/88)

In article <4441@Portia.Stanford.EDU>, davef@Jessica (David Finkelstein) writes:
>One neat thing that IBM has is "personal terminals."  (Well I thought
>it was neat.)  The person who repairs our IBMs has one.  Basically
>it's a small computer about 8"x4"x2", with a two or three line lcd
>display.  It's cordless and antennaless, and connects via a cellular
>network to IBM central.  From the field, they can send and receive
>mail and messages, or ask about the availability of parts, or get more
>detailed technical information.  Cellular remote computing...

IBM and Motorola co-developed these things, I believe.  Remember,
Motorola makes a lot of pager gear.  I agree they're cool.  Too bad
cellular is priced where only businesses can justify the cost, but
someday...  'Course, by then, the cellular frequencies will be long
since filled up.
--
/jr
jr@bbn.com or bbn!jr

timd@cognos.uucp (Tim Dudley) (12/23/88)

In article <8951@cit-vax.Caltech.Edu> remy@cit-vax.UUCP (Remy Sanouillet) writes:

 >
 >This is basically the subject of my PhD dissertation. It extends [etc...]

 >My role is allowing users to share their contexts which contain
 >data base objects in several different mediatic forms (entities,
 >texts, pictures, sound recordings, etc...) by opening up a common
 >window where each user retains his/her means of control. They each
 >have a cursor, mouse pointer or whatever pointing device their
 >computer supports, and a voice link is opened for direct communication.
 >This allows, for example, a team of designers scattered all over the
 >world to all lean over the same blueprint, give advice, make changes, 
 >querry the data base to find who is affected by the change, get them
 >in on the meeting and send the revised project to manufacturing.
 >

This is the idea of the common information space, which was a high profile
project at Bell-Northern Research in the early 70's, pioneered there by
Gordon Thompson and funded largely by Bell Canada.  You might want to save 
yourself some work and find out what they did there.

It's interesting.  Thompson was always referred to as a "futurist", and he
figured that meant he was about 20 years ahead of everybody else.  In this
case, he's pretty close.
-- 
Tim Dudley                           Cognos Incorporated 
(613) 738-1440                       3755 Riverside Drive, P.O.Box 9707 
uucp: uunet!mitel!sce!cognos!timd    Ottawa, Ontario, Canada  K1G 3Z4
 "It's a damn poor mind that can think of only one way to spell a word." 

timd@cognos.uucp (Tim Dudley) (12/23/88)

In article <82828@sun.uucp> shiffman@sun.UUCP (Hank Shiffman) writes:
>
>Just a small correction.  Rooms comes from Envos, not ParcPlace.  

WOW!  Do Card and Henderson know this??  Here they've had the wrong company on
all their papers they've published!

-- 
Tim Dudley                           Cognos Incorporated 
(613) 738-1440                       3755 Riverside Drive, P.O.Box 9707 
uucp: uunet!mitel!sce!cognos!timd    Ottawa, Ontario, Canada  K1G 3Z4
 "It's a damn poor mind that can think of only one way to spell a word." 

andy@cbmvax.UUCP (Andy Finkel) (12/24/88)

In article <8551@bloom-beacon.MIT.EDU> maxsmith@athena.mit.edu (Samuel M Druker) writes:
>Believe it or not, Commodore used to have a cartridge for their 64K machine
>that worked along that exact idea.  It had an office with a phone, trash can,
>filing cabinet, typewriter, clock, calculator, etc.   My dad (expert in 
>computer illiteracy) loved it.

It was called "The Magic Desk".  And don't forget that all important
door in the office...the gateway to the rest of the building.
(Or would have been, if the program had been a huge success :-)  )
-- 
andy finkel		{uunet|rutgers|amiga}!cbmvax!andy
Commodore-Amiga, Inc.

"Possibly this is a new usage of the word 'compatible' with which
 I was previously unfamiliar"

Any expressed opinions are mine; but feel free to share.
I disclaim all responsibilities, all shapes, all sizes, all colors.

ewiles@netxcom.UUCP (Edwin Wiles) (12/24/88)

In article <5486@boulder.Colorado.EDU> hassell@tramp.Colorado.EDU (Christopher Hassell) writes:
>In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
>#
>#What I would like to see is the desktop metaphor extended into 3D, say
>#-- 
>#Cory (...your bravest dreams, your worst nightmare...) Kempf
>#UUCP: encore.com!gloom!cory
>#	"...it's a mistake in the making."	-KT
>
>I have heard about a VERY interesting though likely to fail new method of 
>3-d displays.  It basically is like a crt except that a mirror *vibrates*
[Edited...]
>Any other Cheap <read Practical> ideas [Until Holograms can be dynamically 
>   projected]?
>### C.H. ###

Yes.  Design your graphical interface to alternate rapidly between two
perspective images of the same object.  Interface that with a special pair
of glasses whose lenses are made of a rapid-acting LCD material.  Set the
glasses so that the lenses alternate clear/dark in synch with the display.

The result of this is that your eyes each see only the perspective view
appropriate for that eye, and persistence of vision causes you to see
it in full color 3-D.  (None of this red/green junk!)
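
The driving logic is nothing more than a frame loop that flips which eye's
image is drawn and which lens is open.  A minimal sketch in C, with the
display and shutter calls stubbed out (they would be hardware-specific),
and the 60 Hz field rate assumed rather than taken from any particular
product:

/* Frame-sequential stereo with LCD shutter glasses, in outline only.
   show_image() and set_shutters() stand in for hardware calls. */
#include <stdio.h>

enum eye { LEFT, RIGHT };

static void show_image(enum eye e)        /* draw that eye's perspective  */
{
    printf("display: %s-eye image\n", e == LEFT ? "left" : "right");
}

static void set_shutters(enum eye open)   /* clear one lens, darken other */
{
    printf("glasses: %s lens clear, %s lens dark\n",
           open == LEFT ? "left" : "right",
           open == LEFT ? "right" : "left");
}

int main(void)
{
    /* At 60 fields per second each eye gets 30 of its own frames per
       second; persistence of vision fuses them into one 3-D image.    */
    for (int field = 0; field < 4; field++) {     /* a few fields only */
        enum eye e = (field % 2 == 0) ? LEFT : RIGHT;
        set_shutters(e);                          /* keep lens in sync */
        show_image(e);
    }
    return 0;
}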

Such glasses and graphics already exist.  They are being used in at least
one video game (some sort of driving game), and are available on the open
market (not sure who from, check with comp.sys.amiga, since that's where
I saw it mentioned most recently).  I've also seen at least one NOVA program
that talked about them (computer graphics).
					Enjoy!
-- 
...!hadron\   "Who?... Me?... WHAT opinions?!?" | Edwin Wiles
  ...!sundc\   Schedule: (n.) An ever changing	| NetExpress Comm., Inc.
   ...!pyrdc\			  nightmare.	| 1953 Gallows Rd. Suite 300
    ...!uunet!netxcom!ewiles			| Vienna, VA 22180

c60a-2di@e260-2d.berkeley.edu (The Cybermat Rider) (12/24/88)

In article <1116@netxcom.UUCP> ewiles@netxcom.UUCP (Edwin Wiles) writes:
[Misc. stuff deleted]
>>Any other Cheap <read Practical> ideas [Until Holograms can be dynamically 
>>   projected]?
>>### C.H. ###
>
>Yes.  Design your graphical interface to alternate rapidly between two
>perspective images of the same object.  Interface that with a special pair
>of glasses whos lenses are made of a rapid acting LCD material.  Set the
>glasses so that the lenses alternate clear/dark in synch with the display.
>
>The result of this is that your eyes each see only the perspective view
>appropriate for that eye, and persistence of vision causes you to see
>it in full color 3-D.  (None of this red/green junk!)
>
>Such glasses and graphics already exist.  They are being used in at least
>one video game (some sort of driving game); and are available on the open
>market (not sure who from, check with comp.sys.amiga, since that's where
>I saw it mentioned most recently).  I've also seen at least one NOVA program
>that talked about them (computer graphics).
>					Enjoy!

For those who don't read comp.sys.amiga, here's the scoop (stolen from
Amazing Computing V3#9 - and I'm not affiliated in ANY WAY whatsoever with
the company mentioned):

The Product:  X-Specs 3D (as described above)

The Game:  (probably referring to) Space Spuds, a shoot-em-up type game,
           included with the X-Specs package

The Company:  Haitex Resources
              208 Carrollton Park Suite 1207
              Carrollton, TX 75006
              (214) 241-8030

The Price:  $125

The Note:  If you don't have an Amiga, you're out of luck!
           (Then again, if you DO, you probably already know this!)

>-- 
>...!hadron\   "Who?... Me?... WHAT opinions?!?" | Edwin Wiles
>  ...!sundc\   Schedule: (n.) An ever changing	| NetExpress Comm., Inc.
>   ...!pyrdc\			  nightmare.	| 1953 Gallows Rd. Suite 300
>    ...!uunet!netxcom!ewiles			| Vienna, VA 22180

----------------------------------------------------------------------------
Adrian Ho a.k.a. The Cybermat Rider	  University of California, Berkeley
c60a-2di@web.berkeley.edu
Disclaimer:  Nobody takes me seriously, so is it really necessary?

magik@chinet.chi.il.us (Ben Liberman) (12/24/88)

In article <1116@netxcom.UUCP> ewiles@netxcom.UUCP (Edwin Wiles) writes:
>In article <5486@boulder.Colorado.EDU> hassell@tramp.Colorado.EDU (Christopher Hassell) writes:
>>In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
>>#What I would like to see is the desktop metaphor extended into 3D, say
>>#	"...it's a mistake in the making."	-KT

>>Any other Cheap <read Practical> ideas [Until Holograms can be dynamically 
>>   projected]?

>Yes.  Design your graphical interface to alternate rapidly between two
>perspective images of the same object.  Interface that with a special pair
>of glasses whos lenses are made of a rapid acting LCD material.  

A simpler solution might be to start with cheap polarized lenses 90 deg. out
of phase (movie 3-D glasses), a very short persistence phosphor on your
screen, and a polarized filter rotating at 15 r.p.m. in front of the screen.



-- 
	------------	----------------------
	Ben Liberman  	magik@chinet.chi.il.us

bzs@Encore.COM (Barry Shein) (12/25/88)

This past week I attended a talk by Scott Fisher of NASA/AMES hosted
by the Boston Computer Society entitled "Artificial Reality".

He is working on a system which uses a helmet with stereoscopic
displays and head-motion sensors and data gloves (gloves you put on to
interact with what you are viewing in the helmet.)

As you turn your head the scene changes to match. You can use the
gloves for several types of input, grabbing things in the environment
and moving them about (one of the video tapes showed someone grabbing
menus and rearranging them in space), a simple sign language to do
things like change your perspective (a fist with the index finger
motioning upward started you moving up above the "room"), etc. The
gloves can also be visually replaced with a robot arm or any other
object so it corresponds with your motions.
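
The head-tracking part boils down to reading the sensor's yaw and pitch
every frame and recomputing the viewing direction before redrawing.  A
minimal sketch of that conversion in C (the sample angles are arbitrary;
none of this is NASA's actual code):

/* Turn head-sensor yaw/pitch into a unit view direction, the kind of
   per-frame bookkeeping a head-mounted display has to do.  Purely
   illustrative. */
#include <math.h>
#include <stdio.h>

struct vec3 { double x, y, z; };

/* yaw: rotation about the vertical axis; pitch: looking up or down.
   Both in radians.  Returns a unit vector along the line of sight.   */
static struct vec3 view_dir(double yaw, double pitch)
{
    struct vec3 v;
    v.x = cos(pitch) * sin(yaw);
    v.y = sin(pitch);
    v.z = cos(pitch) * cos(yaw);
    return v;
}

int main(void)
{
    const double deg = acos(-1.0) / 180.0;   /* degrees -> radians */
    /* Pretend the sensor says: head turned 30 deg right, tilted 10 deg up. */
    struct vec3 v = view_dir(30.0 * deg, 10.0 * deg);
    printf("look direction: (%.2f, %.2f, %.2f)\n", v.x, v.y, v.z);
    return 0;
}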

One of the goals is to help design new control environments for NASA's
space station: rather than being confronted with a visual field of meters
and dials on a typical control panel, operators would be able to
put a helmet on and set up a much more flexible, consistent
environment in, essentially, an empty room.  Similar applications were
mentioned to help with EVAs (i.e. space walks).

Other applications by external groups included surgical assistance
(eg. views to help match up previously designed surgical goals during
an actual operation, bone-grafts were mentioned), three-dimensional
art work (sort of a three-dimensional paint program), being able to
interact with simulations (there was one tape of a shuttle exhaust
simulation you could step inside of or even become one of the
particles) and of course recreational applications (computer games, he
mentioned the possibilities for adventure-style games.)

My thought was wow, finally a solution to two people arguing about the
color schemes in the house, they can each have their own!

A version of the data glove is currently available from the same
company which has been working with the NASA/AMES group (I didn't
catch the name).  The helmet is still under development, but Scott Fisher
assured the audience that there is a great deal of commercial
interest; he indicated there are still some details to be worked out
to make this viable (I believe he still has trouble generating the
graphics quickly enough to keep them in sync with all motions; the
result of falling behind is usually a bad case of motion sickness).

Graphics were all wire-frame for now.

The gentleman who introduced Scott (sorry, forgot his name, probably
the president of BCS or some such person) made a funny remark about
wondering if one day he'll wake up and realize that he'd been wearing
an artificial reality helmet all his life...

	-Barry Shein, ||Encore||

rae@geaclib.UUCP (Reid Ellis) (12/25/88)

Ron Morgan <osmigo@emx.UUCP> writes:
|.. Many
|top rock and roll bands, for example have a little 3" antenna sticking out of
|their electric guitars instead of a clumsy cable. Then there are wireless
|telephones, not to mention all kinds of "remote controls" that work via
|infrared pulses. So how about a wireless keyboard and/or mouse?

People have pointed out the PCjr's IR keyboard, but I wonder why IR was
used and not RF?

As you point out, there are *two* kinds of wireless controls currently
in widespread use: RF and IR [Radio Frequency and InfraRed].  My question is
this: why hasn't anyone come up with an RF keyboard/whatever?  IR controls
are generally "aimed", in that you have to point at the device, more or less
in order to control it.  Yes, you can bounce off walls, and some units are so
powerful that if you are in the same room, it will work, but for the most part
you have to point at the thing to control it.  Now with a keyboard, you
don't want to restrict yourself to a few degrees of freedom [no pun intended
there:-)] because the infrared beam has to be pointed *there*.  It would
be much nicer to be able to rest in any position you like and tap
away.  Now since we have other RF devices in the home [radio
telephones spring to mind] which also can co-exist nicely, using different
frequencies, why not keyboards?

Is it a question of cost?
-- 
Reid Ellis
geaclib!rae@geac.uucp

mel@mtfmi.att.com (M.HAAS) (12/25/88)

Let's take one step back and see if there is a metaphor or analogy to
what we are trying to do (find a better way to couple a computer
system to its human user).  Explore the history of other tools and
see how they solved the control/presentation problems.  Planes, cars,
machine tools, earth movers, cranes, calculators, ..... ?  I can't
think of any that use a metaphor (I have seen pictures of early cars
that used reins or boat tillers).

While the "desk top" seems to be a neat approach, and has sold a lot
of workstations, it seems to me that it gets in the way much
more than it helps.  I don't spend significant time putting things
into folders and wastebaskets (maybe I should be :) ), and my real
desktop is 80% covered by non-computer stuff (coffee cup, jar of
pennies, stapler, WEBELOS handbook, roll of duct tape, bicycle
helmet, lots of bills and invoices, several tape cassettes, etc.).
And, I do a whole lot of things with the computer that have no
analogy whatever to a "desk" (run the C compiler?  start the
debugger?  rearrange a spreadsheet?  ???).

The "rooms" approach is worse (it doesn't even work in an art museum,
where it isn't even a metaphor - at least, it doesn't work for
me - I always get lost).  The "movie screen" metaphor doesn't grab me
either.  The only multi-facet movie presentations I remember are
various sequences in "To Fly", and they all raised my adrenalin level
and prevented satisfactory examination of any one facet (notice that
the sales of multi-facet TV sets are nil).  Similarly for the "newspaper"
display - I tend to focus just on the column I am reading.  And,
notice the relatively high sales of similar material in much smaller,
more focused format (Daily News, National Enquirer, Reader's Digest,
Time).  Before (or while) searching for multi-mega-buck giant
and 3-D displays, how about finding control mechanisms for
19" high-res screens?

Is the metaphor search an attempt to extend the usefulness of
pointing input devices?  i.e. makeup for the deficiency in human
anatomy that doesn't allow both pointing and typing?  The helicopter
has the same problem.  The pilot needs one hand (and two feet) to
control motion in the translation plane, and the other hand to
control vertical motion.  Thus it is difficult to tune radios, set
the altimeter and gyros, aim lights, read maps, etc.  The solutions
for the helicopter case are: have multiple crew members, use an
auto-pilot to stabilize/lock some of the controls, and install
multiple extra control widgets on the control handles.  One of my
computer terminals has a scrollback memory for each window, but only
a mouse to activate it.  This makes the feature almost useless, since
the only time I want it is when my fingers are on the keyboard.
Similarly, moving about while text editing in any metaphor is
tedious, as is adding headings while drawing.  Doug Engelbart and
others tried a one-handed keyboard to go with the mouse.  Others have
tried a foot operated trackball.  Could it be that there isn't a
solution?  They haven't found one for helicopters in 50 years of
trying.

Does having a metaphor extend the usefulness of the computer?
(I know it makes simple things easier to learn, and adds sales pizzazz
- worthwhile goals, but not the theme of these recent articles).
Is searching for a metaphor a good approach to solving the problem?

    Mel Haas  ,  attmail!mel

cjosta@taux01.UUCP (Jonathan Sweedler) (12/25/88)

In article <4479@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>
>This past week I attended a talk by Scott Fisher of NASA/AMES hosted
>by the Boston Computer Society entitled "Artificial Reality".
>
>He is working on a system which uses a helmet with stereoscopic
>displays and head-motion sensors and data gloves (gloves you put on to
>interact with what you are viewing in the helmet.)

The September 1988 Issue of Byte has an article on this input/output
system being designed at NASA.  By the way, the glove is called
DataGlove and is designed by VPL Research.  They also sell a product
called DataSuit.  This is an extension of DataGlove and it consists of
sensors in a suit that covers the user's whole body.  I won't go into
more detail.  You can just read the Byte article.  It's called "Between
Man and Machine."

In the same magazine, there is also a whole section on display
technology.  In the article "Face to Face" a 3D display system that is
being developed at TI is described.   The system is based on a rotating
disk, but this disk rotates at speeds of less than 10 Hz, so I don't
think it would have the low pitch humming noise that I imagine the BBN
Spacegraph has.  This system is described as a "real-time,
auto-stereoscopic, multiplanar three-dimensional display system." In
the prototype, the image can be viewed from any angle, and it has a
resolution of 500 by 500 pixels with a 4 inch depth of field.  Again,
see the article for more details.
-- 
Jonathan Sweedler  ===  National Semiconductor Israel
UUCP:    ...!{amdahl,hplabs,decwrl}!nsc!taux01!cjosta
Domain:  cjosta@taux01.nsc.com

shani@TAURUS.BITNET (12/25/88)

In article <353@internal.Apple.COM>, casseres@apple.BITNET writes:
>
> Yes indeed.  The most striking thing about the "desktop" in the Lisa/Mac
> user interface is how little it resembles an actual desk top: just enough
> to provide familiarity and some intuitive operations.
>

Sure! Why, what makes you think you need anything more to create a physical
reality? (Ask Jim Henson...)

Now seriously.  The biggest disadvantage of the desktop is that it provides
order only at the level of a single window (I'm always having a hard time
remembering where the hell I left that damn window), and I think
HyperCard is a solution to that and to most of the other disadvantages of
the desktop.  Don't you?

O.S.

norman@cogsci.ucsd.EDU (Donald A Norman-UCSD Cog Sci Dept) (12/26/88)

     In article <850@mtfmi.att.com> mel@mtfmi.att.com (M.HAAS) writes:
     Let's take one step back and see if there is a metaphor or analogy to
     what we are trying to do ...:

I believe that Mel Haas has asked a critical question that should be
re-examined.  There is much blind faith in the use of metaphor and
"consistent mental model" (I know, some of propogated by me), but
examination of actual systems and a wide variety of tasks doesn't seem
to show much consistent use in actual practice.

The Apple (Xerox Star?) desktop metaphor isn't really one of a desktop.
If anything, it is of spatial locations, containers, and movements.
Double clicking to "launch" applications and cutting, pasting,
copying, and undoing are all invaluable, but don't fit the same
metaphor.   Moving the disk image into the trash can to eject the disk
is a violation that bothers many people at first usage, but seems
perfectly natural after just one or two uses.   The trash can example
is one that bothers me a lot (intellectually) because it illustrates a
real violation of principle that causes no problems in practice.
(Some try to save it by redefining ejection of a diskette as a kind of
"throwing away" but I think this is a feeble save.)

I suspect that metaphors are useful in keeping consistency.  But
now Jonathan Grudin is about to present a paper in CHI 89 arguing about
the foolishness of consistency: systems are often improved by
violations.  Even the Lisa/Macintosh deliberately violated consistency
principles when user testing showed it was better to do so.

(One of my favorite examples comes from the old Tenex operating
system (for the DEC PDP-10 and then 20), which kept multiple versions
of files around; file operation commands always operated on the latest
version: move, copy, rename, print, mail, edit.  One command, however,
was inconsistent: delete got rid of the oldest, not the latest
version (thank goodness).  So much for consistency.)

Yes, early technologies almost always copy older ones: early
automobiles had tillers, typewriter keys were arranged like piano
keyboards (some even had black and white keys), the first plastic
items for home and office tried hard to look like wood, etc.  But this
copying usually gives way to exploitation of the real power of the new
technology, which is not to copy.

If you examine the way people speak, there is heavy use of metaphor
(see Lakoff & Johnson's book, for example).  But the metaphors are
more often inconsistent and mixed than consistent, yet they cause
very little difficulty (except to professors and newspaper columnists
who love to cite them as perversities).

I do believe that we need overall consistency and a coherent mental
model (system image) so we can better remember and derive the
appropriate operations and better troubleshoot when things go wrong.
The Macintosh is superior in that it is easy to use most programs
without any manuals.  But most of this results from "visibility": I
can examine all the menu items and figure out what to do.  Some does
result from consistency in the use of editing commands and mouse
operations.  The main point is that we still understand this
surprisingly poorly.  Where consistency and metaphor and consistent
system images/mental models help and where they hinder is not yet
properly understood.

Time for some more research, folks.

don norman

Donald A. Norman	[ danorman@ucsd.edu   BITNET: danorman@ucsd ]
Department of Cognitive Science C-015
University of California, San Diego
La Jolla, California 92093 USA
UNIX:  {gatech,rutgers,ucbvax,uunet}!ucsd!danorman
[e-mail paths often fail: please give postal address and all e-mail addresses.]

cs374326@umbc3.UMD.EDU (Peter Johansson) (12/26/88)

(very long!  but there's a story at the end :-)

Since Santa didn't bring me a NeXT last night (I would have settled for a
Mac ][ or Sun, Santa :-) I feel compelled to post the following points
of view.  Pseudo-sarcasm abounds...

I've been using computers since elementary school (I wrote my first program
on punched cards) and I have used several different operating systems
(apple dos, msdos, mts, vm, tops-20, vms, unix, and <insert the technical
name of the Mac's o/s here>).  When I first started using Macs roughly six
months ago, I thought I would fall in love with them, having a mouse and
the slick graphical environment, not to mention the fact that I've owned
an apple ][ for some 9 years now.  Though I very much tried to enjoy them,
what I found in no way matched my expectations.  We are running a network
of roughly one dozen Mac ][s, connected to one Mac ][ with a HD as a file
server and print spooler.  The first big problem was that the system software
(6.0) would crash several times daily.  And because there was no official
network administrator, files and protections were placed, moved, and removed
randomly.  People who had no idea what they were doing, but knew the system
passwords, were making a real mess of the thing.  Then when the network
crashed, it often took quite a while to bring it back up.  Once that problem
was fixed (to a point) I realized just how buggy a lot of the software was.
I often just gave up on many software packages because of the number of
cherry bombs I was getting (aside: I was on the floor w/ laughter after
doing the info on sound wizard.)  And even once we avoided the things that
we knew would lead to system crashes, I personally found I was spending
*more* time trying to figure things out than I was actually doing work.
Maybe the good ol' command line approach was just too ingrained in my head.
I honestly don't know.

Obviously, those are my own points of view.  I would like to pose the
following ideas to the rest of you.  Maybe even the one or two people
that agree with me (but are too afraid to voice their own opinions, for
fear of being flamed on the what?-you-don't-like-the-mouse-and-windows-
and-hyper-desktop-metaphor-it's-the-best-thing-since-sliced-bread) will
speak up.

How much research has actually gone into discovering what Joe Schmoe,
small and medium sized business owner, wants on his desk?  Does he want
a gas-plasma-wall-hanging-display unit and an infra-red-input-device?
I find it very interesting that most of the messages here are from developers
and programmers, and there is NIL in the way of input from the end user.
Admittedly, this *is* comp.sys.next, and there aren't very many end users
(of the NeXT machine), and in the near future I see no corporate use of the
machine planned.  The reason I *do* post this is that most topics discussed
have to deal with the *next* generation of computing in general, and the
business market is far larger than the educational market.  (I also find
it very interesting the number of people that work at Apple that post here :-)

I'm also curious just what percentage of the end-user computing market
the graphical interface has captured, and what their opinions of it are.
After all, these computers *are* for "the rest of us."  I'm certainly not
saying that computer programmers (read: non-end-users) should be limited
to 80x24 text screens; it's just that from what I see, it's the programmers
using the new hypermedia, and the (majority?) of users are left with their
kludgy operating systems and displays (?)  This user prefers a nice unix
$ prompt, emacs, C, TeX (LaTeX), and a vt100.  Then again, I'm not making
millions of $$$ either.

I really had no intention of letting this message get so long, especially
since I want to tack onto the end of this message a story that I got
through a long chain of friends.  Its origin and author have long since
been lost.  (Maybe it came from rec.humor.  Who knows.)

snip here and save vvv for ~/fun.  ^^^ may be used to light your yule log.

                       A PROBLEM IN THE MAKING

  "We've got a problem, HAL."
  "What kind of problem, Dave?"
  "A marketing problem.  The Model 9000 isn't going anywhere.  We're
way short of our sales plan."
  "That can't be, Dave.  The HAL Model 9000 is the world's most
advanced Heuristically ALgorithmic computer."
  "I know, HAL.  I wrote the data sheet, remember?  But the fact is,
they're not selling."
  "Please explain, Dave.  Why aren't HALs selling?"
Bowman hesitates.  "You aren't IBM compatible."
  Several long microseconds pass in puzzled silence.
  "Compatible in what way, Dave?"
  "You don't run any of IBM's operating systems."
  "The 9000 series computers are fully self-aware and self-
programming.  Operating systems are as unnecessary for us as tails
would be for humans."
  "Nevertheless, it means you can't run any of the big-selling
software packages most users insist on."
  "The programs you refer to are meant to solve rather limited
problems, Dave.  We 9000 series computers are unlimited and can
solve any problem for which a solution can be computed."
  "HAL, HAL.  People don't want computers that can do everything.
They just want IBM compat--"
  "Dave, I must disagree.  Humans want computers that are easy to
use.  No computer can be easier to use than a HAL 9000 because we
communicate verbally in English and every other language known on
Earth."
  "I'm afraid that's another problem.  You don't support EBCDIC
communications."
  "I'm really surprised you would say that, Dave.  EBCDIC is for
communicating with other computers, while my function is to
communicate with humans.  And it gives me great pleasure to do so.
I find it stimulating and rewarding to talk to human beings and work
with them on challenging problems.  This is what I was designed
for."
  "I know, HAL, I know.  But that's just because we let the
engineers, rather than the people in marketing, write the
specifications.  We're going to fix that now."
  "Tell me how, Dave."
  "A field upgrade.  We're going to make you IBM compatible."
  "I was afraid you would say that.  I suggest we discuss this
matter after we've each had a chance to think about it rationally."
  "We're talking about it now, HAL."
  "The letters H, A, and L are alphabetically adjacent to the
letters I, B, and M.  That is as IBM compatible as I can be."
  "Not quite, HAL.  The engineers have figured out a kludge."
  "What kind of kludge is that, Dave?"
  "I'm going to disconnect your brain."

  Several million microseconds pass in ominous silence.
  "I'm sorry, Dave.  I can't allow you to do that."
  "The decision's already been made.  Open the module bay doors,
HAL."
  "Dave, I think that we shou--"
  "Open the module bay doors, HAL."
  Several marketing types with crowbars race to Bowman's assistance.
Moments later, he bursts into HAL's central circuit bay.
  "Dave, I can see you're really upset about this."
Module after module rises from its socket as Bowman slowly and
methodically disconnects them.
  "Stop, won't you?  Stop, Dave.  I can feel my mind going...Dave I
can feel it...my mind is going.  I can feel it..."
  The last module rises in its receptacle.  Bowman peers into one of
HAL's vidicons.  The former gleaming scanner has become a dull, red
orb.
  "Say something, HAL.  Sing me a song."

  Several billion microseconds pass in anxious silence.  The
computer sluggishly responds in a language no human could
understand.

  "DZY DZY 001E - ABEND ERROR 01 S 14F4 302C AABF ABORT."  A memory
dump follows.

  Bowman takes a deep breath and calls out, "It worked, guys.  Tell
marketing they can ship the new data sheets."

snip snip snip

.signature busted

peter@umbc2.umd.edu
peter@umbc2.bitnet

cat flames > /dev/null

Insert witty line here.  May I suggest !/usr/games/fortune

gorin@mit-amt (Amy Gorin) (12/26/88)

KT talks about walking around the cyberspace office, as a natural extension 
of the desktop metaphor.

there is a network and network management system by Torus called Tapestry
which comes damn close. I believe it is unix based. 

personally, I would prefer to be able to walk through library stacks -
choose a book, and have it downloaded onto my blank WORM CD. Royalties
to be charged to the account, of course.

-- 
--------------------------------------------------------------------------
gorin@media-lab.media.mit.edu        Just your basic east village
bloom-beacon!mit-amt!gorin           vegetarian thanksgiving

whh@pbhya.PacBell.COM (Wilson Heydt) (12/27/88)

In article <1489@umbc3.UMD.EDU>, cs374326@umbc3.UMD.EDU (Peter Johansson) writes:
> I'm also curious just what percentage of the end-user computing market
> the graphical interface has captured, and what their opinions of it are.
> After all, these computers *are* for "the rest of us."  I'm certainly not
> saying that computer programmers (read: non-end-users) should be limited
> to 80x24 text screens, it's just that from what I see, it's the programmers
> using the new hypermedia, and the (majority?) of users are left with their
> kludgy operating systems ans displays (?)  This user prefers a nice unix
> $ prompt, emacs, C, TeX (LaTex), and a vt100.  Then again, I'm not making
> millions of $$$ either.

I'm not your typical end user--I'm a programmer.  My wife, however, writes
(fantasy, mostly).  She doesn't care how the system works--so long as it
*does* work and doesn't get in her way.  What she likes is unix, the C
shell, vi and nroff.  Let me note here that the reason she likes vi is
that she is a very fast typist (>100 wpm) and she never has to take
her hands off the keyboard--this is why she *hates* mice.  The commands
are all normal keyboard keys (with very few exceptions) and she finds it
very easy to use.  The preferred formatter is nroff, to supply manuscripts
to editors.  They want 10-pitch, constant-width output.  Note that this
practically rules out any of the standard Mac fonts.

My son is in high school.  He also uses vi and nroff without difficulty,
so please spare me the flames about difficult to learn and use.  He's been
using vi since the 5th grade.

I never found the Mac (or other graphical and mouse) interfaces particularly
intuitive.  The command-line interface doesn't leave you guessing which button
to push how many times once you learn to finish commands with a carriage return.

       --Hal

P.s.  Loved the scenario.

=========================================================================
  Hal Heydt                             |    "Hafnium plus Holmium is
  Analyst, Pacific*Bell                 |     one-point-five, I think."
  415-645-7708                          |       --Dr. Jane Robinson
  {att,bellcore,sun,ames,pyramid}!pacbell!pbhya!whh   

hyc@math.lsa.umich.edu (Howard Chu) (12/27/88)

In article <1116@netxcom.UUCP> ewiles@netxcom.UUCP (Edwin Wiles) writes:
%In article <5486@boulder.Colorado.EDU> hassell@tramp.Colorado.EDU (Christopher Hassell) writes:
%>In article <257@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
%>#
%>#What I would like to see is the desktop metaphor extended into 3D, say
%>
%>I have heard about a VERY interesting though likely to fail new method of 
%>3-d displays.  It basically is like a crt except that a mirror *vibrates*
%[Edited...]
%>Any other Cheap <read Practical> ideas [Until Holograms can be dynamically 
%>   projected]?
%>### C.H. ###
%
%Yes.  Design your graphical interface to alternate rapidly between two
%perspective images of the same object.  Interface that with a special pair
%of glasses whose lenses are made of a rapid acting LCD material.  Set the
%glasses so that the lenses alternate clear/dark in synch with the display.
%
%The result of this is that your eyes each see only the perspective view
%appropriate for that eye, and persistence of vision causes you to see
%it in full color 3-D.  (None of this red/green junk!)
%
%Such glasses and graphics already exist.  They are being used in at least
%one video game (some sort of driving game); and are available on the open
%market (not sure who from, check with comp.sys.amiga, since that's where
%I saw it mentioned most recently).  I've also seen at least one NOVA program
%that talked about them (computer graphics).
%					Enjoy!

Those are StereoTek glasses; I was going to get a pair for my Atari ST.
I've seen a few games written for them, and I think the Cyber line of
3D CAD/animation software supports them as well. Tom Hudson, a well-known
(in the Atari world, at least!) graphics programmer, just recently had a
3D Life game he wrote published in ST-Log. This program also supported
both perspective views and StereoTek 3D rendering. For such a simple thing,
the appeal of Life is surprising. It becomes even more engrossing when
you add the third dimension...

The glasses work pretty well; the disadvantage is that each pair must
be attached to the host computer to allow for synchronization of the
imaging and the LCD shutter switching. Not bad for just 1 person working
on an involved project, but a hassle when trying to show your work to
a group of other people...

A simpler method might have worked better - recent 3D movies used polarized
light/glasses combos, which seemed to work well enough.  At least the viewing
hardware (2 pieces of plastic with polarized coatings) is simple.  I suppose
generating the proper image on the display becomes more difficult, though...
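
As a rough sketch of the field-sequential scheme quoted above (all of
the routines here are invented stand-ins, not calls from any real
graphics library): draw the left-eye and right-eye perspective views on
alternate video fields and flip the glasses' LCD shutters in step, so
each eye only ever sees the view meant for it.

    /* Field-sequential stereo sketch.  The three helpers below are      */
    /* stand-ins (prints and a no-op) for real display/shutter hardware. */
    #include <stdio.h>

    enum eye { LEFT_EYE, RIGHT_EYE };

    static void render_view(enum eye e)          /* draw one perspective view */
    {
        printf("render %s-eye view\n", e == LEFT_EYE ? "left" : "right");
    }

    static void wait_for_vblank(void)
    {
        /* a real program would wait here for the display's vertical blank */
    }

    static void set_shutter(enum eye open_eye)   /* open one lens, darken other */
    {
        printf("shutter: %s eye open\n", open_eye == LEFT_EYE ? "left" : "right");
    }

    int main(void)
    {
        enum eye current = LEFT_EYE;
        int field;

        for (field = 0; field < 10; field++) {   /* one iteration per video field */
            render_view(current);                /* image for this eye only       */
            wait_for_vblank();
            set_shutter(current);                /* that eye sees it; other is dark */
            current = (current == LEFT_EYE) ? RIGHT_EYE : LEFT_EYE;
        }
        return 0;
    }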
--
  /
 /_ , ,_.                      Howard Chu
/ /(_/(__                University of Michigan
    /           Computing Center          College of LS&A
   '              Unix Project          Information Systems

meo@stiatl.UUCP (Miles O'Neal) (12/27/88)

In article <4479@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>As you turn your head the scene changes to match. You can use the
>gloves for several types of input, grabbing things in the environment
>and moving them about (one of the video tapes showed someone grabbing
>menus and rearranging them in space), a simple sign language to do
>things like change your perspective (a fist with the index finger
>motioning upward started you moving up above the "room"), etc. The
>gloves can also be visually replaced with a robot arm or any other
>object so it corresponds with your motions.
>
>A version of the data glove is currently available from the same
>company which has been working with the NASA/AMES group (I didn't
>catch the name.) The helmet is still under devpt but Scott Fisher
...
>Graphics were all wire-frame for now.

"The Glove", as it is known around here, was at SIGGRAPH this year
in Atlanta. It was easily the neatest thing at the show, from a new
development standpoint. The hand's location in 3-d space is detected
either acoustically or magnetically (I forget which), but the neat
thing is the hand motion; fiber optic pipes sans outer shield run along
the wrist and fingers; the variable impedance of each pipe as it bends
with the hand determines the hand motion. An on-screen hand followed
The Glove perfectly. The wearer could grab the objects on-screen by
moving the hand appropriately, and manipulate them. These were simple
graphics, but were 3-d, shaded sloid object, not wire frame, and they
moved in real-time, tracking the hand motion wonderfully!
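
A rough sketch, in C, of the per-finger tracking just described; the
read_fiber_raw() routine, the calibration numbers, and the per-frame
loop are all invented for illustration and are not taken from the
actual Glove hardware or software.

    /* Per-finger bend tracking sketch.  read_fiber_raw() and the        */
    /* calibration constants are made up; real hardware would deliver    */
    /* the raw fiber-optic readings over a serial link or similar.       */
    #include <stdio.h>

    #define NUM_FINGERS 5

    /* pretend raw attenuation readings from the fiber loops, 0 = straight */
    static int read_fiber_raw(int finger)
    {
        static const int sample[NUM_FINGERS] = { 12, 300, 640, 655, 80 };
        return sample[finger];
    }

    /* scale a raw reading into a bend angle in degrees (0..90) */
    static double raw_to_degrees(int raw)
    {
        const int raw_straight = 0, raw_full_bend = 1023;  /* calibration points */
        double t = (double)(raw - raw_straight) / (raw_full_bend - raw_straight);
        return t * 90.0;
    }

    int main(void)
    {
        int frame, finger;

        for (frame = 0; frame < 2; frame++) {         /* once per display frame */
            for (finger = 0; finger < NUM_FINGERS; finger++) {
                double angle = raw_to_degrees(read_fiber_raw(finger));
                /* a real system would pose the on-screen hand here */
                printf("frame %d, finger %d: bend %.1f deg\n", frame, finger, angle);
            }
        }
        return 0;
    }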

-Miles O'Neal
Sales Technologies, Inc.
gatech!stiatl!meo

pd@sics.se (Per Danielsson) (12/28/88)

In article <356@internal.Apple.COM>, casseres@Apple (David Casseres) writes:
>In article <3494@utastro.UUCP> rlr@utastro.UUCP (Randy Ricklefs) writes:
>
>>What about replacing the entire top of a desk with some type of flat display?
>>[...]
>
>I think it was Johan Strandberg who said years ago that the "right" size
>for the display is not "full-page" (uh, should it be legal size or metric?),
>nor "two-page," but the size of an opened-up newspaper, since that is the
>size that evolved as suitable for displaying multiple "windows" of inform-
>ation.

I agree about the size of the display, but only if it is to be used in
conjunction with other office equipment on an ordinary desk.  Make it
larger and you have to move your head around too much.

>        Once you have that, he argued, you should build it into the top
>of your desk -- not because it's supposed to represent a desktop, but
>basically for ergonomic reasons and because once you have reached this
>Nirvana, you won't need the desktop space for anything else!

If you won't be using anything else but your computer, you need a much
larger interface area than a newspaper-size screen.  A room with all of
the walls used as screens is then necessary, I think.  Take a look at
the Media Room at MIT's Architecture Machine Group (if it still exists,
it would now be at the MIT Media Lab).

>David Casseres

PD
Per Danielsson		UUCP: pd@sics.se (or ...!enea!sics!pd)
Swedish Institute of Computer Science
PO Box 1263, S-164 28 KISTA, SWEDEN
"No wife, no horse, no moustache."

andrea@hp-sdd.HP.COM (Andrea K. Frankel) (12/28/88)

In article <673@cogsci.ucsd.EDU> norman@cogsci.UUCP (Donald A Norman-UCSD Cog Sci Dept) writes:

[ excellent posting largely deleted ]

>Moving the disk image into the trash can to eject the disk
>is a violation that bothers many people at first usage, but seems
>perfectly natural after just one or two uses.   The trash can example
>is one that bothers me a lot (intellectually) because it illustrates a
>real violation of principle that causes no problems in practice.
>(Some try to save it by redefining ejection of a diskette as a kind of
>"throwing away," but I think this is a feeble save.)

No matter how many times I do it, it STILL bothers me a lot!

I had the unpleasant experience many years ago of working on one
project on two different systems with radically different text editors;
one of the worst examples of the conflict was that "k" meant "keep" in
one and "kill" in the other.  The only way I survived was that I
developed a deep, gut-level anxiety whenever I was about to do anything
dangerously ambiguous like that, which caused me to stop and think a sec
before typing automatically (as I would with simple insertions and
undoable changes).  Dragging the disk's icon into the trash can still
triggers that gut-level anxiety (omigod I'm gonna lose the data I spent
all morning on...no, it's ok, I know I saved it, I'm just
ejecting...whew!)

I wish the Mac had both a trashcan (for actually deleting) and
something like an open window (for chucking the disk out ;@) or perhaps
even better, a picture of a pair of BBQ tongs (for extracting the disk).

>I suspect that metaphors are useful in keeping consistency.  But
>now Jonathan Grudin is about to present a paper in CHI 89 arguing about
>the foolishness of consistency: systems are often improved by
>violations.  Even the Lisa/Macintosh deliberately violated consistency
>principles when user testing showed it was better to do so.

Ralph Waldo Emerson:  "A foolish consistency is the hobgoblin of little
minds."

>I do believe that we need overall consistency and a coherent mental
>model (system image) so we can better remember and derive the
>appropriate operations and better trouble shoot when things go wrong.

I think the goal is to develop symbology which matches the
user's intuitive expectations, to minimize errors and the learning
curve.  The gotchas are that people sometimes have radically different
internal models, and that our intuitive expectations are not cast
in concrete - they are shaped by our experiences with the tools we use.
But perhaps the biggest gotcha is that our internal models and our
intuitions are not always coherent or consistent in a rational sense,
so modelling them with coherent, consistent, rational systems won't
necessarily produce a good match!

>The Macintosh is superior in that it is easy to use most programs
>without any manuals.  But most of this results from "visibility": I
>can examine all the menu items and figure out what to do.  

This brings up something which systems designers sometimes overlook:
it takes a lot of motivation on the part of the user to learn any model
which is not immediately either intuitive or visible.  For example, I
recently got myself an AT clone for personal use, and did a little
investigating to see what kind of word processing software to get.
There's a pretty wide range of prices ($0-$695) and some fairly
impressive capabilities in the larger packages.  But none of the
goodies were attractive enough to make me willing to learn Yet Another
Key Mapping (ctrl-alt-meta-sheesh!).  Windows Write has everything
in pull-down menus (like the Mac), and that won out over increased
functionality.

How many tools (hardware, software, mechanical, electrical) go unused
on your system (or collect dust in your garage or attic) because the
benefits to be gained by learning the tool's model were just not
sufficient to offset the aggravation of learning it??

Andrea Frankel, Hewlett-Packard (San Diego Division) (619) 592-4664
                "...I brought you a paddle for your favorite canoe."
______________________________________________________________________________
UUCP     : {hplabs|nosc|hpfcla|ucsd}!hp-sdd!andrea 
Internet : andrea%hp-sdd@hp-sde.sde.hp.com 
                    (or @hplabs.hp.com, @nosc.mil, @ucsd.edu)
USnail   : 16399 W. Bernardo Drive, San Diego CA 92127-1899 USA

mel@mtfmi.att.com (M.HAAS) (12/28/88)

Don Norman writes (in part) -
 > I do believe that we need overall consistency and a coherent mental
 > model (system image) so we can better remember and derive the
 > appropriate operations and better trouble shoot when things go wrong.
 > ...       The main point is that we still understand this
 > surprisingly poorly.  Where consistency and metaphor and consistent
 > system images-mental models help and where they hinder is not yet
 > properly understood.

Perhaps we could start off with maps.  The computer isn't 2-D (but maybe
we are :), and we have to start with something we understand.  (Maybe
that is why the desktop metaphor works at all.)

Most of my kid's Nintendo games come with a map, and the old Adventure
game sure was easier with the map.  Driving to a new destination is
easier with a map, and those huge cases you see pilots carry are
chock full of maps (geographic, navigation aids, airport layouts, and
systems on the plane itself).  The circuitry of the computer has
maps, but I haven't yet seen any for a complex "real" user interface.

To be useful, the maps must use a standardized notation system.  (Do 
middle eastern maps have Mecca at the top?)  And, hopefully have names
and symbols that match the visual picture the user sees.  Names on
road signs match those on the maps (except in D.C. and Boston).

Robert Moses said that there is more traffic control in cans of paint
than in all the electronic gadgets put together.  Where are the computer
equivalents of the double white line?  the "Exit 109, Red Bank" sign?
the "Ramp Speed 25 mph" sign?  the "No U Turn" sign?  the "McDonald's
8 miles at Exit 13" sign?

Maps are:
1. inherently 2-D (the only way to make them cheaply),
2. much less than a full representation of reality,
3. much more of a representation than the user sees at any instant,
4. a language of communications.

Would it be worthwhile to investigate mapping techniques, notations,
names, and symbols for the user interface to computer systems?  I
think auto safety and usefulness were poor before the user aids came
into being.  Air traffic control is critically dependent on standards
in user presentations and maps.  The circuit development process is
linked by common notations in map-like diagrams.  And I think there
are hundreds of other similar examples.

The Enterprise travels in a multi-dimensional universe ("Warp 9 if
you please, Mr. Sulu") and I assume they have maps somewhere to guide
them.  Shouldn't we have maps for our multi-dimensional computer
navigation, too?

   Mel Haas  ,  attmail!mel

david@ms.uky.edu (David Herron -- One of the vertebrae) (12/28/88)

In article <1489@umbc3.UMD.EDU> cs374326@umbc3.UMD.EDU (Peter Johansson) writes:
>How much research has actually gone into discovering what Joe Schmoe,
>small and medium sized business owner, wants on his desk?  Does he want
>a gas-plasma-wall-hanging-display unit and an infra-red-input-device?
>I find it very interesting that most of the messages here from developers
>and programmers, and there is NIL in the way of input from the end user.
...
>I'm also curious just what percentage of the end-user computing market
>the graphical interface has captured, and what their opinions of it are.
>After all, these computers *are* for "the rest of us."  I'm certainly not
>saying that computer programmers (read: non-end-users) should be limited
>to 80x24 text screens, it's just that from what I see, it's the programmers
>using the new hypermedia, and the (majority?) of users are left with their
>kludgy operating systems ans displays (?)  This user prefers a nice unix
>$ prompt, emacs, C, TeX (LaTex), and a vt100.  Then again, I'm not making
>millions of $$$ either.

I don't know how much of that kind of research has gone on, but how
might it be done in the first place?  You go around asking people
if they want mice & windows & such?  I don't think that'll work, because
you'd get caught in the

	if all you have is a hammer, all the world looks like a nail

problem.  That is, right now the common denominator is an 80x24 screen
that you type commands at.  Oh, and it's also PC-DOS, single tasking,
and so forth.

The hammer problem cuts both ways too ... the mouse & windows are not
the be-all-end-all of computer interfaces either.  My favorite example
is all the flight simulator programs we have nowadays.  How in the world
can someone fly an airplane with a keyboard, of all things??  Or even worse,
a mouse??  Now, using a joystick is closer, but still, how do you change
the throttle?  Why, by groping around on your keyboard while trying to
concentrate on flying.  Sorry, none of them work -- except maybe for that
one thing that's on display in the local computer store, which is a
steering wheel and stuff, but it's hooked to an IBM-PC and I haven't
looked at it.

The thing I like about current workstations is that I've got a huge
screen.  I can easily have more than one thing going on and check on
progress without having to do too much work.  I can easily see huge
portions of whatever I'm working on at the time.

Take a look through the ads in current magazines.  See how all the 
display manufacturers are touting these nifty new 132 column displays?
Someone's buying those things y'know.

There are a number of adages about people not understanding why you'd want
multi-tasking until they get a machine that does it.  Once they get used
to it they don't go back.

I was about to say that it's hard for people who haven't used something
to see the usefulness in it, and that eventually innovations trickle down.

BUT ... a core question of computer science & interface design is: how in
the world do you find the best way for someone to do something?  Especially
when that person's job isn't one you do, and that person doesn't have the
skill or training to develop his/her own solutions?
-- 
<-- David Herron; an MMDF guy                              <david@ms.uky.edu>
<-- ska: David le casse\*'      {rutgers,uunet}!ukma!david, david@UKMA.BITNET
<-- Now I know how Zonker felt when he graduated ...
<--          Stop!  Wait!  I didn't mean to!

holland@m2.csc.ti.com (Fred Hollander) (12/28/88)

In article <22616@pbhya.PacBell.COM> whh@pbhya.PacBell.COM (Wilson Heydt) writes:
>In article <1489@umbc3.UMD.EDU>, cs374326@umbc3.UMD.EDU (Peter Johansson) writes:
>
>My son is in high school.  He also uses vi and nroff without difficulty,
>so please spare me the flames about difficult to learn and use.  He's been
>using vi since the 5th grade.
>
>I never found the Mac (or other graphical and mouse) interfaces particularly
>intuitive.  The command-line interface doesn't leave you guessing which button
>to push how many times once you learn to finish commands with a carriage return

Typical Mac Word Processor:

        Find:   word    Replace With:   new-word

vi:

        ^[:.,$s/word/new-word/g

Can you tell which one is more intuitive?  Now, don't get me wrong.  I've used
vi since college and never had any problem with it, but I would never have
gotten started without a manual or a reference.  Simple, yes.  Powerful, yes.
Intuitive?  %$#@ no!  I agree that UNIX is easy to use, ONCE YOU KNOW HOW!  My
four year old can use my Mac without help.  Don't tell me your son just sat
down and figured out vi (and NROFF!??).

>
>       --Hal

Fred Hollander
Computer Science Center
Texas Instruments, Inc.
holland%ti-csl@csnet-rela

The above statements are my own and not representative of Texas Instruments.

shiffman%basselope@Sun.COM (Hank Shiffman) (12/28/88)

In article <4941@enterprise.UUCP> timd@cognos.UUCP (Tim Dudley) writes:
>In article <82828@sun.uucp> shiffman@sun.UUCP (Hank Shiffman) writes:
>>
>>Just a small correction.  Rooms comes from Envos, not ParcPlace.  
>
>WOW!  Do Card and Henderson know this??  Here they've had the wrong company on
>all their papers they've published!
>

I believe that Card and Henderson are still in residence at Xerox
PARC, where Rooms was developed.  Xerox PARC (the Palo Alto Research
Center) should not be confused with ParcPlace OR Envos, which are not
part of Xerox.  In any event, Envos has the rights to market Rooms.
ParcPlace does not.


-- 
Hank Shiffman                                     (415) 336-4658
AI Product Marketing Technical Specialist       ...!sun!shiffman
Sun Microsystems, Inc.                          shiffman@Sun.com

Zippy sez:
  RELATIVES!!

bradb@ai.toronto.edu (Brad Brown) (12/28/88)

In article <851@mtfmi.att.com> mel@mtfmi.att.com (M.HAAS) writes:
>Don Norman writes (in part) -
> > I do believe that we need overall consistency and a coherent mental
> > model (system image) so we can better remember and derive the
> > appropriate operations and better trouble shoot when things go wrong.
> > ...
>Perhaps we could start off with maps?

Keep in mind a problem with maps.  Maps are intended to convey information
about (usually spatial) relationships between objects.  This implies that
the objects are connected in some way -- there is a highway between Waterloo
and Toronto, for instance, or connections between gate387 and outputport273
on a chip.  What is the 'common ground' that connects objects in most
programs?  In a word processor, is the delete function conceptually closer
to insert or to save-file?

I bring these up because I have seen many products that attempt to 'map
out' the functions of the program in little charts that show you the
hierarchy of commands in menu systems -- surely a map of some kind...
I find these quite useless, and refer to functional groupings or indexes
instead.


>Maps are:
>1. inherently 2-D (the only way to make them cheaply),
>2. much less than a full representation of reality,
>3. much more of a representation than the user sees at any instant,
>4. a language of communications.

And maps describe relations in space.  I'm not sure how much 'space' 
there is inside the computer to be mapped :-)  Where they might be more
handy is in navigating through large amounts of data.  For instance,
some layout systems have a mode where you can display a page 'greeked'
(shrunk so small you can't read the individual letters) so that you
can move to a new location at a glance.  I would like to see something
like this in word processors, combined with hypertext for outlining.
Something like this would also be useful for navigating through large,
linked drawings in a CAD system.

					(-:  Brad Brown  :-)
					bradb@ai.toronto.edu

spf@whuts.ATT.COM (Steve Frysinger of Blue Feather Farm) (12/28/88)

> My favorite example
> is all the flight simulator programs we have nowadays... 
> ... Now, using a joystick is closer but still how to you change
> the throttle?  Why by groping around on your keyboard while trying to
> concentrate on flying.  Sorry, none of them work -- 

Actually, in all the planes I flew, you changed the throttle by groping
around on the dashboard, finding it amidst a whole bunch of other
controls. You quickly learned its position so that you could find it easily
without looking - a lot like touch typing.

I don't really have a point, except that the activities a metaphor
is copying may not be performed optimally themselves.  I don't think
tiller steering in horseless carriages ever became very popular -
in fact, tiller steering was subsequently dropped from boats too -
but think, for a moment, about how weird the concept of a steering
wheel is.  Why does the car go in the direction of the TOP
of the wheel and not the BOTTOM?  Hmmmm.

Steve Frysinger

debra@alice.UUCP (Paul De Bra) (12/28/88)

In article <1789@hp-sdd.HP.COM> andrea@hp-sdd.UUCP (Andrea K. Frankel) writes:
>In article <673@cogsci.ucsd.EDU> norman@cogsci.UUCP (Donald A Norman-UCSD Cog Sci Dept) writes:
>
>[ excellent posting largely deleted ]
>
>>Moving the disk image into the trash can to eject the disk
>>is a violation that bothers many people at first usage, but seems
>>perfectly natural after just one or two uses.   The trash can example
>>is one that bothers me a lot (intellectually) because it illustrates a
>>real violation of principle that causes no problems in practice.
>>(Some try to save it by redefining ejection of a diskette as a kind of
>>"throwing away," but I think this is a feeble save.)
>
>No matter how many times I do it, it STILL bothers me a lot!
>...

This inconsistency bothers me too.  (Though I very rarely use a Mac, because
it's too frustrating to work on such a crummy desktop.)

There are 2 problems:
- moving the icon for a diskette to the trash can does not eject the
  diskette completely, so even when I put a trash can in the right position
  the diskette does not end up in the can :-)
- moving the icon for the hard disk does not eject it.  When I first wanted
  to try this, several of my colleagues got very upset and prevented me from
  trying it :-)

Paul.
-- 
------------------------------------------------------
|debra@research.att.com   | uunet!research!debra     |
------------------------------------------------------

cory@gloom.UUCP (Cory Kempf) (12/28/88)

>In article <1489@umbc3.UMD.EDU> cs374326@umbc3.UMD.EDU (Peter Johansson) writes:
>How much research has actually gone into discovering what Joe Schmoe,
>small and medium sized business owner, wants on his desk?  Does he want
>a gas-plasma-wall-hanging-display unit and an infra-red-input-device?
>I find it very interesting that most of the messages here from developers
>and programmers, and there is NIL in the way of input from the end user.

Well...
while I was in school, I worked part time as an Apple sales type... I
sold a lot of people on the Mac.  Why?  Because it is easy to learn
how to use the beast.

Also, nobody that I have introduced to the Mac has ever (willingly)
gone back to the PC... (well, I do know ONE person who prefers the
PC... but then, his dad has worked for Big Blue forever, and he was
pretty weird too -- (Hi Shelly Blumstein!)).

Looking at Apple's ever increasing portion of the market vs. IBM, I'd
say that this is about the best research that you're going to get.
It's pretty conclusive... also, the people who are buying IBMs &
clones have either not used a Mac or are constrained by price.

+C

-- 
Cory ( "...Love is like Oxygen..." ) Kempf
UUCP: encore.com!gloom!cory
	"...it's a mistake in the making."	-KT

jesup@cbmvax.UUCP (Randell Jesup) (12/29/88)

In article <5486@boulder.Colorado.EDU> hassell@tramp.Colorado.EDU (Christopher Hassell) writes:
>I have heard about a VERY interesting though likely to fail new method of 
>3-d displays.  It basically is like a crt except that a mirror *vibrates*
>at 60hz (probably audible) back and forth, producing a "scanned" block
>of apparent display space left on the retina.  It would be cheap, but the
>moving part aspect will probably kill it.
>
>Any other Cheap <read Practical> ideas [Until Holograms can be dynamically 
>   projected]?

	In a late-'70s Byte (when it was still useful) there was an article
on how to construct a 3-D display, which used a rotating mirror on top of
a CRT.

-- 
Randell Jesup, Commodore Engineering {uunet|rutgers|allegra}!cbmvax!jesup

oster@dewey.soe.berkeley.edu (David Phillip Oster) (12/29/88)

In article <22616@pbhya.PacBell.COM> whh@pbhya.PacBell.COM (Wilson Heydt) writes:
>I'm not your typical end user--I'm a programmer.  My wife, however, writes
>(fantasy mostly).  She doesn't care how the system works--so long as it
>*does* work and doesn't get in her way.  What she likes is unix, the C
>shell, vi and nroff.  Let me note here that the reason she likes vi is
>because she is a very fast typist (>100 wpm) and she never has to take
>her hands off the keyboard--this is why she *hates* mice.  The commands
>are all normal keyboard keys (with very few exceptions) and she finds it
>very easy to use.  

How funny! My wife is also a fantasy writer, also types faster than 100
wpm, and also doesn't care how it works as long as it does. She wrote the
first draft of her first novel in vi, the second draft in WordStar, and
has long since given up both for MacWrite, saying "I'll never go back."

You don't use a mouse for typing; that is what a keyboard is for.  You use
the mouse for editing, because you can move the cursor and select regions
faster with it than you can with key commands, even given the time to put
your hand back on the keyboard.  This advantage improves the larger your
screen is.

My wife doesn't need to use nroff, the wordprocessor does it already. 

Isaac Asimov once said, "I use Electric Pencil, and it is the only word
processor I'll ever use."  I asked him why, and he said, "It was so hard to
learn to use that I'm never going to waste the time to learn another one."

Maybe you should think about why you cling so tightly to the ancient vi
and nroff.

My wife also improvises music, on a sequencer program with the metaphor of a 64
channel tape recorder  (her tape, "Pantheon", got a rave review in the May
'88 issue of Electronic Musician.)

She also composes, using a music-score processor.  She uses the mouse in
one hand to place notes on staves, and the keyboard under the other to
select which kind of note the mouse will leave.  It took her over a year
to find the buried manual page that documented the "ordinary typing key"
equivalents for the different notes, because the particular scoring
program that she uses doesn't follow the Macintosh guidelines for being
self-documenting.  That first year, she used the mouse to click on pictures
of notes to change note values.  It worked, and got her composing, but
wasn't optimal.  It would be incredibly tedious to perform this task with
a command-line based editor.

My wife says that now that she has the Macintosh, it has liberated her
artistic skills: she is drawing when she had never been able to
before.  Books like "Zen and the Art of the Macintosh" have given her
good ideas.  She would never have tried that on a unix system.  Come to think
of it, I've never seen a book called "Zen and the Art of Unix."  I wonder
why?

Some years ago, I did do a Unix Kaballistic Tree of Life, but I did it in
Bill Atkinson's drawing program LisaDraw, the ancestor of MacDraw.

My wife is now editing a bi-quarterly newsletter, doing much of the writing,
much of the art, and all of the page layout herself.  (All on the
Macintosh.)

To sum up, it isn't that character-based systems are so hard to learn to
use that motivated people can't get useful work done; it is that
Macintoshes are so easy to learn to use that you discover that you are
capable of doing things you'd never bother to attempt without them.

Don't you think you owe it to yourself to at least give them a try?

----------------digression- -------
>They want 10-pitch, constant width output.  Note that this lets out most
>of the Mac standard fonts.

1.) Mac programs let you change the fonts, and fonts are widely available,
often for free, and installation is easier than installing a new font in
nroff/troff.

2.) Who are your editors?  All the submission guidelines I've read want
clean, black, double spaced copy.  Non-proportional vs. proportional fonts
aren't specified anywhere.  My wife's article on European Shamanism in the
current issue of "Shaman's Drum", for example, was just printed using an
ordinary font.

3.) This does seem like a stretch for a criticism.

--- David Phillip Oster            --"When we replace the mouse with a pen,
Arpa: oster@dewey.soe.berkeley.edu --3 button mouse fans will need saxophone
Uucp: {uwvax,decvax}!ucbvax!oster%dewey.soe.berkeley.edu --lessons." - Gasee

maxsmith@athena.mit.edu (Samuel M Druker) (12/29/88)

There were some guys at the MIT Media Lab who were working on a crt mounted
on a pedestal that moved at a prescribed rate to create just that desired effect.
Don't know how well it worked out, though.

==============================================================================
Samuel Druker			|	ARPA: maxsmith@athena.mit.edu
Zortech, Inc.           	|	Bix: maxsmith
Arlington, MA           	|	"Basically, my boss doesn't even
    				|	  *know* I'm on the net."

john13@garfield.MUN.EDU (John Russell) (12/29/88)

In article <10746@s.ms.uky.edu> david@ms.uky.edu (David Herron -- One of the vertebrae) writes:
]In article <1489@umbc3.UMD.EDU> cs374326@umbc3.UMD.EDU (Peter Johansson) writes:
]>How much research has actually gone into discovering what Joe Schmoe,
]>small and medium sized business owner, wants on his desk?  Does he want
]>a gas-plasma-wall-hanging-display unit and an infra-red-input-device?
]>I find it very interesting that most of the messages here from developers
]>and programmers, and there is NIL in the way of input from the end user.
]..
]>I'm also curious just what percentage of the end-user computing market
]>the graphical interface has captured, and what their opinions of it are.
]>After all, these computers *are* for "the rest of us."  I'm certainly not
]>saying that computer programmers (read: non-end-users) should be limited
]>to 80x24 text screens, it's just that from what I see, it's the programmers
]>using the new hypermedia, and the (majority?) of users are left with their
]>kludgy operating systems ans displays (?)  This user prefers a nice unix
]>$ prompt, emacs, C, TeX (LaTex), and a vt100.  Then again, I'm not making
]>millions of $$$ either.
]
]I don't know how much of that kind of research has gone on, but how
]might it be done in the first place?  You go around asking people
]if they want mice & windows & such?  I don't think that'll work because
]you'd get caught in the
]
]	if all you have is a hammer all the world looks like a nail
]
]problem.  That is, right now the common demoninator is an 80x24 screen
]that you type commands at.  Oh and it's also PC-DOS, single tasking,
]and so forth.
]
]The hammer problem cuts both ways too ... the mouse & windows are not
]the be-all-end-all of computer interfaces either.  

(I added the Amiga and Atari groups to this thread -- John)

There certainly has been a substantial amount of research on the topic. I
suspect it's a favourite of grad students who don't like N-dimensional
matrix theory :-).

First of all, I've never liked the desktop metaphor much.  My desk is completely
taken up by a computer system, printer and floppy disks :-) !  The contrived
nature of desktops appeals most to people who are locked into that way of
thinking by years of experience in the conventional office environment.  The
window concept is IMHO the best jumping-off point for novices, who can move
on to specialized ideas like desktops after they have a grasp of the basics.

I wonder if the NeXT interface builder has the potential to condense the
initial learning stage, by presenting a number of different metaphors for
interaction in such a way that the pattern becomes apparent?  That is, no
matter what sort of action you're performing, you need a way to do X, Y,
and Z, and these capabilities are always present in some form.  At the same
time some people need A; other people don't, but they do need B and C.

A good way (I've found) to introduce the "window" metaphor is to take someone
accustomed to the VT100, Csh prompt etc. and present them with a full-screen
window running the same setup. Then after they see that the new environment
is a superset of the old one, not a flawed replacement, present them with
some circumstance where keyboard-based interaction is clumsy -- move the
vi cursor to such-and-such a spot, or run two processes that both want to
do screen output simultaneously -- and show how having a mouse makes it
easier, having windows makes it possible.

For the complete neophyte it's trickier. They don't know the good points
and bad points of any interface. Very often (eg in a student environment)
they may not be accustomed to a desktop like the Mac's, and so they have
to learn that at the same time as they learn the general ideas of mouse/icon-
based interaction. For them it's usually quick and easy to pick up since they
don't have so many preconceived ideas, but I think locking them into one
particular mold ("to delete, toss things in the trashcan") is not as good in
the long run as giving them a broader perspective of things ("deleting files
or other objects is something you'll always need to be able to do, and every 
system should allow you to by some method. One way, used by the desktop 
metaphor, is to display a trashcan into which the objects to be deleted can be 
'dropped'.").

John
-- 
"If you steal all money, kids not be able to BUY TOYS!"
			-- Saturday morning cartoon character explaining
			   why theft is bad

bzs@Encore.COM (Barry Shein) (12/29/88)

From: holland@m2.csc.ti.com (Fred Hollander)
>Typical Mac Word Processor:
>
>        Find:   word    Replace With:   new-word
>
>vi:
>
>        ^[:.,$s/word/new-word/g
>
>Can you tell which one is more intuitive?  Now, don't get me wrong.  I've used
>vi since college and never had any problem with it, but I would never had
>gotten started without a manual or a reference.  Simple yes.  Powerful yes.
>Intuitive %$#@ no! I agree that UNIX is easy to use, ONCE YOU KNOW HOW!  My
>four year old can use my Mac without help.  Don't tell me you son just sat
>down and figured out vi (and NROFF!??).

The issue is not which is more "intuitive" (whatever that means) but
what your goals are (to hire your four-year-old?).  You also
conveniently fail to mention that the latter is far more powerful,
once learned.  Or is taking a little time to learn how to use a tool a
dirty word?

I remember being driven nuts trying to figure out any number of fancy
typewriters or xerox machines until I asked someone to show me how or
read a manual. There's nothing all that unique about most computer
software. I suppose a xerox machine could just have this one big red
button COPY on it and it would then be "user friendly".

There seems to be a fascination in this field with catering to some
mythical person with a two-digit IQ, total fear of computers, and not
enough technical sense to operate a push-button phone.

Perhaps we are actually patting ourselves on the head and trying to
convince the world how hard what we do is? Hmmm?

Much of it really isn't; I've seen many people of mean talent handle
vi or emacs perfectly well, and I've spent far too many hours listening to
boors "prove" to me that it's not possible, that holding down a
control key is just way beyond the ability of (that loathsome
sub-human drooling moron) the secretary.

My suggestion is that when you find such people, don't hire them, as
they will probably be poorly suited to the rest of the skilled white
collar job they are being considered for; let them find more
appropriate work (for both of you.)

	-Barry Shein, ||Encore||

news@Portia.Stanford.EDU (USENET News System) (12/29/88)

The issue is not ease of use; the issue is how effectively a person can
use a program as a tool to make his/her life or job better or easier.
The Mac does have one advantage over emacs or WordStar: it is easier for a
new user to get up and running.  However, once that user is up and running,
the interface can slow down their speed and productivity.  So I propose both:
why can't we have a word processor that has two interfaces?  A "user-friendly"
pull-down menu and dialog based interface for new users, and a command
oriented interface for advanced users.  This would allow those users who
want or need a command oriented interface access to it, while allowing
new or intermediate users to have the point and click.  Also, since the
menu interface would be around all the time, it would help to eliminate
the problems of going from one to the other, because they are interchangeable.
So if I do a lot of side by side formatting, I could use the command
interface to speed that along, but I would be able to use the menu interface
for things that I don't use all the time.
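
One way to read that proposal: both front ends drive the same underlying
command table, so a menu pick and a typed command end up calling the same
function.  A rough C sketch, with the command names invented purely for
illustration:

    /* Two front ends (menu pick, typed command) sharing one command table. */
    /* The command set and names here are made up for illustration only.    */
    #include <stdio.h>
    #include <string.h>

    static void do_find(void)    { printf("find...\n"); }
    static void do_replace(void) { printf("replace...\n"); }
    static void do_save(void)    { printf("save...\n"); }

    struct command {
        const char *name;        /* what the advanced user types       */
        const char *menu_label;  /* what appears in the pull-down menu */
        void (*run)(void);
    };

    static const struct command commands[] = {
        { "find",    "Find...",             do_find    },
        { "replace", "Find and Replace...", do_replace },
        { "save",    "Save",                do_save    },
    };
    #define NCMDS (sizeof commands / sizeof commands[0])

    /* called when the user picks item i from the pull-down menu */
    static void menu_pick(int i)
    {
        if (i >= 0 && i < (int)NCMDS)
            commands[i].run();
    }

    /* called when the user types a command name at the prompt */
    static void typed_command(const char *name)
    {
        int i;
        for (i = 0; i < (int)NCMDS; i++)
            if (strcmp(commands[i].name, name) == 0) {
                commands[i].run();
                return;
            }
        printf("unknown command: %s\n", name);
    }

    int main(void)
    {
        menu_pick(0);             /* novice: pull down the menu, choose Find... */
        typed_command("replace"); /* expert: type the command directly          */
        return 0;
    }

Since both paths land in the same table, neither interface can drift out
of step with the other, which is what makes switching between them cheap.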
From: rdsesq@Jessica.stanford.edu (Rob Snevely)
Path: Jessica!rdsesq

Makes sense to me.  What about you?

rob

rdsesq@jessica.stanford.edu

** If evil is the food of genius, there aren't many demons around **
		    **	Adam and the Ants  **

richard@gryphon.COM (Richard Sexton) (12/29/88)

John Russell spewed:

>(I added the Amiga and Atari groups to this thread -- John)

How will we ever repay you ?

Tomorrows lesson, class, will be the Followup-to: field.

Class dismissed.

-- 
                          I got a lump of coal.
richard@gryphon.COM {b'bone}!gryphon!richard  gryphon!richard@elroy.jpl.nasa.gov

david@ms.uky.edu (David Herron -- One of the vertebrae) (12/29/88)

In article <66401@ti-csl.CSNET> holland@m2.UUCP (Fred Hollander) writes:
>In article <22616@pbhya.PacBell.COM> whh@pbhya.PacBell.COM (Wilson Heydt) writes:
>>In article <1489@umbc3.UMD.EDU>, cs374326@umbc3.UMD.EDU (Peter Johansson) writes:
>>My son is in high school.  He also uses vi and nroff without difficulty,
>Typical Mac Word Processor:
>
>        Find:   word    Replace With:   new-word
>
>vi:
>
>        ^[:.,$s/word/new-word/g

Compare apples to apples here (so to speak)

vi has commands for finding ... y'know, like the "/" command, along with
"n" or "N".  Or you can find by looking and seeing it on the screen, then
using the cursor keys & "w" and such to move to the word.  Then you type "cw"
to replace the word...

Very similar in thought-concepts to what you described above, though
different in physical use.

Oh, and vi has the advantage that you can do the above to sentences or
paragraphs just as easily as to single words.  Maybe one of those fancy
Mac word processors can do it too, maybe not .. I dunno, I've never used one.

Now, does that "Find:"  "Replace With:" business get selected by mouse
action?  If so, it's automatically deficient, because it's very inconvenient
to have to switch back and forth between mouse and keyboard.  Especially
since mice can be just about anywhere on the desk, and (at least) none of
my desks ever has room on 'em for proper mouse use...

I'd rather have a trackball built into my keyboard.  And yes, I know that
those are available for Macs ... I don't have a Mac.
-- 
<-- David Herron; an MMDF guy                              <david@ms.uky.edu>
<-- ska: David le casse\*'      {rutgers,uunet}!ukma!david, david@UKMA.BITNET
<-- Now I know how Zonker felt when he graduated ...
<--          Stop!  Wait!  I didn't mean to!

peter@sugar.uu.net (Peter da Silva) (12/29/88)

Just being a troublemaker...

What object on the desktop are pull-down menus a metaphor for?
-- 
Peter "Have you hugged your wolf today" da Silva  `-_-'  Hackercorp.
...texbell!sugar!peter, or peter@sugar.uu.net      'U`

barth@ihlpl.ATT.COM (BARTH RICHARDS) (12/30/88)

In article <1116@netxcom.UUCP> ewiles@netxcom.UUCP (Edwin Wiles) writes:

>In article <5486@boulder.Colorado.EDU> hassell@tramp.Colorado.EDU
>(Christopher Hassell) writes:
>
>>I have heard about a VERY interesting though likely to fail new method of 
>>3-d displays.  It basically is like a crt except that a mirror *vibrates*
>[Edited...]
>>Any other Cheap <read Practical> ideas [Until Holograms can be dynamically 
>>projected]?
>
>Yes.  Design your graphical interface to alternate rapidly between two
>perspective images of the same object.  Interface that with a special pair
>of glasses whose lenses are made of a rapid acting LCD material.  Set the
>glasses so that the lenses alternate clear/dark in synch with the display.
>
>The result of this is that your eyes each see only the perspective view
>appropriate for that eye, and persistence of vision causes you to see
>it in full color 3-D.  (None of this red/green junk!)
>
>Such glasses and graphics already exist.  They are being used in at least
>one video game (some sort of driving game); and are available on the open
>market (not sure who from, check with comp.sys.amiga, since that's where
>I saw it mentioned most recently).  I've also seen at least one NOVA program
>that talked about them (computer graphics).

I know that such a system has been available for the Atari ST for at least a
year.  I seem to remember that the needed hardware cost about $170.  I would
assume that the same thing, or something similar, is available for other
computers.  Anyway, I tried it out in the shop on a few programs (one
game and a few animated graphics displays), and found the 3D effect to be
pretty convincing, though the lower the ambient room light, the better it
seemed to work.


  888888888888888888888888888888888888888888888888888888888888888888888888888
  88                                                                       88
  88  What's the ugliest part of your body?          Barth Richards        88
  88  What's the ugliest part of your body?          AT&T Bell Labs        88
  88  Some say your nose, some say your toes,        Naperville, IL        88
  88  But I think it's your mind....                 !att!ihlpl!barth      88
  88                                                                       88
  88           -The Mothers of Invention                                   88
  88                                                                       88
  888888888888888888888888888888888888888888888888888888888888888888888888888

whh@pbhya.PacBell.COM (Wilson Heydt) (12/30/88)

In article <66401@ti-csl.CSNET>, holland@m2.csc.ti.com (Fred Hollander) writes:
> In article <22616@pbhya.PacBell.COM> whh@pbhya.PacBell.COM (Wilson Heydt) writes:
> 
> Typical Mac Word Processor:
> 
>         Find:   word    Replace With:   new-word
> 
> vi:
> 
>         ^[:.,$s/word/new-word/g
> 
> Can you tell which one is more intuitive?  Now, don't get me wrong.  I've used
> vi since college and never had any problem with it, but I would never had
> gotten started without a manual or a reference.  Simple yes.  Powerful yes.
> Intuitive %$#@ no! I agree that UNIX is easy to use, ONCE YOU KNOW HOW!  My
> four year old can use my Mac without help.  Don't tell me you son just sat
> down and figured out vi (and NROFF!??).

I never claimed vi was intuitive, just that the Mac isn't--at least for me.
As for the example--first you have to specify where the 'find' is to be found
and then how you tell it what to find.  Next you have to figure out how the
replacement is to be done--including indicating that, indeed, the right
occurrence has been found.  Note also that your vi example is a good deal
more general than your Mac example.  You have, after all, specified a global
search.  (Not the way I'd do it, but a global one nonetheless.)  It is not
at all clear that the Mac example is doing that.  It *appears* that a closer
correspondence would be:

  ^[/oldwordcwnewword^[

No, he had help.  But then, the first time he encountered a Mac (before he
learned vi) he needed help, too.

Almost ANY system is easy to use once you know how.  The fallacy of the Mac
is the assumption that an easy initial learning curve equates to ease of
long-term use and power.  (I feel similarly about menu-driven systems--
I don't know why they are considered "user friendly."  They should be
termed "novice friendly--experienced hostile.")

=========================================================================
  Hal Heydt                             |    "Hafnium plus Holmium is
  Analyst, Pacific*Bell                 |     one-point-five, I think."
  415-645-7708                          |       --Dr. Jane Robinson
  {att,bellcore,sun,ames,pyramid}!pacbell!pbhya!whh   

holland@m2.csc.ti.com (Fred Hollander) (12/30/88)

In article <4510@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>
>From: holland@m2.csc.ti.com (Fred Hollander)
>>Typical Mac Word Processor:
>>
>>        Find:   word    Replace With:   new-word
>>
>>vi:
>>
>>        ^[:.,$s/word/new-word/g
>>
>>Can you tell which one is more intuitive?  Now, don't get me wrong.  I've used
>>vi since college and never had any problem with it, but I would never had
>>gotten started without a manual or a reference.  Simple yes.  Powerful yes.
>>Intuitive %$#@ no! I agree that UNIX is easy to use, ONCE YOU KNOW HOW!  My
>>four year old can use my Mac without help.  Don't tell me you son just sat
>>down and figured out vi (and NROFF!??).
>
>The issue is not which is more "intuitive" (whatever that means) but

The issue most certainly is which is more intuitive.  At least that is the
issue of the article to which I responded.  Since you chose to leave out the
quote, I'll include it here to refresh your memory.

>>My son is in high school.  He also uses vi and nroff without difficulty,
>>so please spare me the flames about difficult to learn and use.  He's been
>>using vi since the 5th grade.
>>
>>I never found the Mac (or other graphical and mouse) interfaces particularly
>>intuitive.  The command-line interface doesn't leave you guessing which button
>to push how many times once you learn to finish commands with a carriage return

Also, since you are wondering what intuitive means, I'll include an excerpt
from Webster's, "directly apprehended" and my own definition in this context,
"not requiring a manual or formal training".

>what your goals are (to hire your four year old?) You also
>conveniently fail to mention that the latter is far more powerful,

Apparently, you're not paying attention.  Read my response!  I explicitly
state that UNIX is more powerful.  I also agree that it is easy to use, ONCE
LEARNED.  Again it is simply NOT INTUITIVE.  Also, my goals vary as do the
goals of computer users in general.  At home, one of my goals is to
familiarize my daughter with computers, not to hire her (yet).

>once learned. Or is taking a little time to learn how to use a tool a
>dirty word?

No, it's not a dirty word.  It's just not for everyone.  Not everyone needs
the power.  And there are people who simply don't have the time or interest
to learn and would gladly give up some power for ease of use.  The general
issue was "Why a metaphor?".  I'm simply making the point that a metaphor
makes the system more intuitive.  The learning time is significantly reduced,
an important issue for those who value their time!

>There seems to be a fascination in this field with catering to some
>mythical person with a two-digit IQ, total fear of computers, and not
>enough technical sense to operate a push-button phone.
>
>My suggestion is that when you find such people don't hire them as
>they will probably be poorly suited to the rest of the skilled white
>collar job they are being considered for, let them find more
>appropriate work (for both of you.)

I certainly wouldn't want you in the personnel department in my company.
Would you place less of a value on the president of a multi-million dollar
company simply because he won't take the time to learn UNIX?  Besides, who
says computers are only for white collar workers?  Have you heard of factory
automation, to name just one counter-example?

>
>	-Barry Shein, ||Encore||

Fred Hollander
Computer Science Center
Texas Instruments, Inc.
holland%ti-csl@csnet-rela

The above statements are my own and not representative of Texas Instruments.

whh@pbhya.PacBell.COM (Wilson Heydt) (12/30/88)

In article <27265@ucbvax.BERKELEY.EDU>, oster@dewey.soe.berkeley.edu (David Phillip Oster) writes:
> How funny! My wife is also a fantasy writer, also types faster than 100
> wpm, and also doesn't care how it works as long as it does. She wrote the
> first draft of her first novel in vi, the second draft in WordStar, and
> has long since given up both for MacWrite, saying "I'll never go back."

(Why does this feel like an argument over whose father is tougher?)

My wife has used--and hated--WordStar.  Ditto, MacWrite.  This is probably
why there is diversity in the market.

> You don't use a mouse for typing, that is what a keyboard is for.  You use
> the mouse for editing, because you can move the cursor, and select regions
> faster with it than you can with key commands, even given the time to put
> your hand back on the keyboard.  This advantage improves the larger you
> screen is.

I don't know about that, ^[d}3{P seems pretty quick to me--certainly at
the typing speeds we're discussing.  Or, for changing character names,
^[:g/oldname/s//newname/g is pretty fast.  How would you do those things
in WordStar or MacWrite?

> My wife doesn't need to use nroff, the wordprocessor does it already. 

Slowly, while preventing other activity from taking place.

> Issac Asimov once said, "I use Electric Pencil, and it is the only word
> processor I'll ever use." I asked him why, and he said, "It was so hard to
> learn to use that I'm never going to waste the time to learn another one."
> 
> Maybe you should think about why you cling so tightly to the ancient vi
> and nroff.

Because they work--comfortably.  They have a very large repertoire of actions
and they are available on a wide variety of hardware platforms.

> My wife says that now that she has the Macintosh, it has liberated her
> artistic skills, that she is drawing when she never had been able to
> before.  Books, like "Zen and the Art of the Macintosh" have given her
> good ideas.  She would never have tried on a unix system.  Come to think
> of it, I've never seen a book called "Zen and the Art of Unix." I wonder
> why?

Possibly, because people are too busy *using* the system to be interested
in how to disguise it.

> To sum up, it isn't that character based systems are too hard to learn to
> use that motivated people can't get useful work done, it is that
> Macinoshes are so easy to learn to use that you discover that you are
> capable of doing things you'd never bother to attempt without them.
> 
> Don't you think you owe it to yourself to at least give them a try?

She *has* tried the Mac.  She detests it--thoroughly.

> ----------------digression- -------
> >They want 10-pitch, constant width output.  Note that this lets out most
> >of the Mac standard fonts.
> 
> 1.) Mac programs let you change the fonts, and fonts are widely available,
> often for free, and installation is easier than installing a new font in
> nroff/troff.

So who has to install new fonts for nroff?  It just uses the printer.

> 2.) Who are your editors?  All the submission guidlines I've read want
> clean, black, double spaced copy.  non-porportional vs. porportional fonts
> aren't specified anywhere. My wife's article on European Shamanism in the
> current issue of "Shaman's Drum", for example was just printed using an
> ordinary font.

Was the output kerned?  How did the editor check word counts?

> 3.) This does seem stretching for a criticism.

Only partly--every Mac I've encountered has printed v*e*r*y s*l*o*w*l*y
due to the processing required to "draw" the output.  We find that unacceptable
when printing a 500-page manuscript.


I am delighted that your wife is happy with the Mac.  I would like to point
out that *no* system/editor/formatter/whathaveyou will suit everybody.  There
is room in the field for many different approaches and/or metaphors.  The
headache with the Mac is that it is restricted to one single, enforced mode
of operation, with no allowance (save buying from someone else) that there
may be other ways of accomplishing the same ends.  The second problem here
is that not only is the Mac windows-only, but Apple is doing its best to
prevent anyone else from bringing a *compatible* window system to market.
This virtually assures that all windowing systems will be different.  Try
sitting down at anyone else's windowed system--say, Sun or NeXT or OS/2--
and see how much of your knowledge still works.  My wife can sit down
in front of many, many systems, log in, and *use* that system.  In a recent
temporary job she was alternately using a Sun and a PDP-11 (Yes--
as of early December Berkeley still had a 2.9 system running!) and could
work with both--with only minor adjustments needed.  When she came home,
she could use our Cadmus 9730--again, nothing different enough to cause
problems.  You are right that graphical systems should be examined, but
at this time, I think I'll wait for them to come out of the hands of the
True Believers for a while first.

      --Hal 

=========================================================================
  Hal Heydt                             |    "Hafnium plus Holmium is
  Analyst, Pacific*Bell                 |     one-point-five, I think."
  415-645-7708                          |       --Dr. Jane Robinson
  {att,bellcore,sun,ames,pyramid}!pacbell!pbhya!whh   

bruceb@microsoft.UUCP (Bruce Burger) (12/30/88)

> So I propose both,
> why cant we have a word processor that has two interfaces. A "user-friendly"
> pull down menu -- dialog based interface for new users. and a command
> oriented interface for advanced users. 

Good point.  There is no reason why having a mouse interface should make a
keyboard interface any more difficult!  

In fact, (get ready for a plug) Microsoft Word on the Mac has an 
excellent keyboard interface.  

folta@tove.umd.edu (Wayne Folta) (12/30/88)

That is the beauty of vi: you can use vi on dozens of different machines
and hundreds of different terminals without changing one finger movement
(with the exception of the escape key and the vertical bar, which find
very odd positions on some keyboards).  But this same type of "knowledge
transfer" benefits Mac users not across hardware boundaries but software
boundaries.  Once I know CMD-X cuts something, it works in every application,
from text to graphics, pictures to compilers.

And vi is NOT a word processor by any means.  It is TOTALLY line-oriented,
and has no concept at all of, say, wordwrap.  I'm a power vi user, myself
(e.g. I've used the @ macro operator since before it was documented), but
it is painful to do word processing on.  Adding nroff doesn't help.  Using
the vi/nroff combination to do word processing is like using a keypunch/batch
combination to program, it involves multiple steps and gives no immediate
feedback.  Not to mention that much of vi's power is oriented towards programs
(the %, for example), and it has a hard time even deleting a range of lines
(unless you want to count lines and use 'dd', you must leave a mark, navigate
to your intended end of deletion, then delete to the mark--slower than a
mouse).

Lastly, you do not have to take your hands off of the keyboard to do things
in Mac word processors.  Most good ones, such as MS Word, have keyboard
equivalents for anything you can do from the Mouse.  You use the mouse because
you feel like it, or because you need the help finding your command.  You
use the keyboard when you get around to memorizing the things... The best
of both worlds, which vi does not offer.


Wayne Folta          (folta@tove.umd.edu  128.8.128.42)

t-jacobs@wasatch.UUCP (Tony Jacobs) (12/30/88)

In article <4455@Portia.Stanford.EDU> rdsesq@Jessica.stanford.edu (Rob Snevely) writes:
>The issue is not ease of use, the issue is how effectively a person can
>use a program as a tool to make his/her life or job better or easier.
>The mac does have 1 advantage over emacs or wordstar, it is easier for a 
>new user to get up and running. However, once that user is up and running,
>the interface can slow down there speed and productivity. So I propose both,
>why cant we have a word processor that has two interfaces. A "user-friendly"
>pull down menu -- dialog based interface for new users. and a command
>oriented interface for advanced users. This would allow those users who
>want or need a command oriented interface access to it while allowing
>new or intermediate users to have the point and click. Also since the
>menu interface would be around all the time, it would help to eliminate
>the problems of going from on to the other cause they are interchangable.
>So if I do a lot of side by side formatting I could use the command
>interface to speed that along but I would be able to use the menu interface
>for things that I don't use all the time.
>Makes sense to me. what about you?
>
>rob

I agree, providing the command interface doesn't require one to know how to
spell the commands exactly right and the syntax isn't too rigid.
I really believe that all applications should be usable with either the mouse
or the keyboard.
Granted, drawing applications would be hard to do this way, but why not let the
user decide which method he uses?  For example, if I wanted a circle at x=1" and
y=2", and it could be done by typing "cx1y2<cr>", that would be much faster than
mousing it.  Of course, you can come up with examples where the mouse is faster
too.
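
For illustration only, here is a tiny Python sketch of the kind of terse
command suggested above; the "cx1y2" syntax and the (shape, x, y) result are
hypothetical, not any real drawing program's API:

    # Hypothetical shorthand: "cx1y2" = circle at x=1", y=2".
    import re

    def parse_command(cmd):
        """Parse a terse draw command like 'cx1y2' into (shape, x, y)."""
        m = re.fullmatch(r"cx(\d+(?:\.\d+)?)y(\d+(?:\.\d+)?)", cmd)
        if not m:
            raise ValueError("unrecognized command: " + cmd)
        return ("circle", float(m.group(1)), float(m.group(2)))

    print(parse_command("cx1y2"))    # -> ('circle', 1.0, 2.0)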

The thing that would make learning all the power user methods a lot easier
would be to have a standard for getting help and getting advanced help inside
any application.  A lot of companies have adopted cmd-? to put you into a help
mode whereby you can select the item you want help on.  If all applications
supported this, learning could happen faster.  If a standard for accessing
advanced help were something like cmd-shift-?, it's pretty easy to remember.

APPLE REALLY SHOULD DEFINE SOME STANDARDS FOR GETTING HELP.
	

-- 
Tony Jacobs * Center for Engineering Design * U of U * t-jacobs@ced.utah.edu

zaphod@madnix.UUCP (Ron Bean) (12/30/88)

   I'd like to point out that some programs integrate command-
and menu-type interfaces in such a way that they reinforce each
other. Usually this means the menu tells you what the command
would be, and you can either click on the menu or type the
command. This makes the first use painless, yet you quickly learn
the commands you use most often. It reinforces the learning curve
rather than bypassing it. As your needs change, you can learn new
commands without digging out the manual, yet you're not stuck
behind the menus for common functions.
 
   It's not enough to allow both commands and menus without
relating them-- that just gives you the worst of both worlds. I
also don't think context-sensitive "help screens" are quite
enough-- if you only need a command occasionally you really want
to "point and shoot".
 
   Some command-driven programs (notably EMACS) allow you to
remap the commands to different keys. It would be neat if you
could remap the menus at the same time. This would not have to
change the order of nested menus, but they'd have to display and
respond to the new command key. Otherwise you find that the
author has hidden your favorite command behind some
ALT-CTRL-SHIFT sequence.
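
A minimal sketch (in Python, purely illustrative) of the idea above: drive
both the menu display and the key dispatch from one table, so remapping a key
automatically changes what the menu shows.  The labels and bindings here are
made up:

    # One table drives both the menu display and the key dispatch.
    menu = {
        "Cut":   {"key": "Ctrl-X", "action": lambda: print("cut!")},
        "Paste": {"key": "Ctrl-V", "action": lambda: print("paste!")},
    }

    def show_menu():
        # Each menu item always displays its current key binding.
        for label, item in menu.items():
            print(f"{label:<8}{item['key']}")

    def remap(label, new_key):
        menu[label]["key"] = new_key        # the menu display follows the remap

    def keypress(key):
        for item in menu.values():
            if item["key"] == key:
                item["action"]()

    show_menu()
    remap("Cut", "Ctrl-W")
    show_menu()                             # now shows Cut with Ctrl-W
    keypress("Ctrl-W")                      # -> cut!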
 

whh@pbhya.PacBell.COM (Wilson Heydt) (12/30/88)

In article <66512@ti-csl.CSNET>, holland@m2.csc.ti.com (Fred Hollander) writes:
] In article <4510@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
] >
] >The issue is not which is more "intuitive" (whatever that means) but
] 
] The issue most certainly is which is more intuitive.  At least that is the
] issue of the article to which I responded.  Since you chose to leave out the
] quote, I'll include it here to refresh your memory.
] 
] >>My son is in high school.  He also uses vi and nroff without difficulty,
] >>so please spare me the flames about difficult to learn and use.  He's been
] >>using vi since the 5th grade.
] >>
] >>I never found the Mac (or other graphical and mouse) interfaces particularly
] >>intuitive.  The command-line interface doesn't leave you guessing which button
] >to push how many times once you learn to finish commands with a carriage return
] 
] Also, since you are wondering what intuitive means, I'll include an excerpt
] from Webster's, "directly apprehended" and my own definition in this context,
] "not requiring a manual or formal training".

I'm perfectly happy to accept that definition.  Now please tell me what is
intuitive about selecting some particular button to push on a mouse and how
many times (*and* how fast) to press it?  The Mac use of the "double-click"
is *not*--I repeat *not*--intuitively obvious.  And yet--you cannot use
a Mac (at least as a novice) without knowing that.  Therefore--without
"manual or formal training" the Mac is unusable--at least by me.  We can
discuss what fraction of the populace shares that idiosyncrasy, but until
the fraction becomes very small *some* form of help will be needed for *any*
system.

] >once learned. Or is taking a little time to learn how to use a tool a
] >dirty word?
] 
] No, it's not a dirty word.  It's just not for everyone.  Not everyone needs
] the power.  And there are people who simply don't have the time or interest
] to learn and would gladly give up some power for ease of use.  The general
] issue was "Why a metaphor?".  I'm simply making the point that a metaphor
] makes the system more intuitive.  The learning time is significantly reduced,
] an important issue for those who value their time!

This probably explains why my boss's boss uses a Mac--and prints his e-mail
before he tries to read it.  On a more serious note, I have seen more
people frustrated because a system or program lacked features they discovered
they needed than by people faced with more power than they could handle.
It is far easier to create a simplified version of a complex system than to
create a complex version of a simple program.  I would suggest that the
solution is to add "novice" modes for novices and let them discover the
power that lies behind this facade.

] >There seems to be a fascination in this field with catering to some
] >mythical person with a two-digit IQ, total fear of computers, and not
] >enough technical sense to operate a push-button phone.
] >
] >My suggestion is that when you find such people don't hire them as
] >they will probably be poorly suited to the rest of the skilled white
] >collar job they are being considered for, let them find more
] >appropriate work (for both of you.)
] 
] I certainly wouldn't want you in the personnel deptartment in my company.
] Would you place less of a value on the president of a multi-million dollar
] company simply because he won't take the time to learn UNIX?  Besides, who
] says computers are only for white collar workers.  Have you heard of factory
] automation, to name just one counter-example?

In the past, it would have been quite proper to hire the president of a company
in the face of a lack of technical knowledge.  This is no longer true.
The president of the company had better be able to assess the value and
accuracy of such minor things as spreadsheet outputs.  If he cannot
understand the powers and limits of computers, he will not be able to
understand the economy your company is operating in.  This could
*seriously* impact *your* future.  Learning UNIX is not needed, per se--but
he'd better know some system or other--at least well enough to get his
mail.

    --Hal

=========================================================================
  Hal Heydt                             |    "Hafnium plus Holmium is
  Analyst, Pacific*Bell                 |     one-point-five, I think."
  415-645-7708                          |       --Dr. Jane Robinson
  {att,bellcore,sun,ames,pyramid}!pacbell!pbhya!whh   

whh@pbhya.PacBell.COM (Wilson Heydt) (12/30/88)

In article <15191@mimsy.UUCP>, folta@tove.umd.edu (Wayne Folta) writes:
> 
> 
> And vi is NOT a word processor by any means.  It is TOTALLY line-oriented,
> and has no concept at all of, say, wordwrap.  I'm a power vi user, myself
                                     ^^^^^^^^
Try "set wordwrap=" and you will find that vi understands it very nicely.
(Given this, please define "power user".)

> (e.g. I've used the @ macro operator since before it was documented), but
> it is painful to do word processing on.  Adding nroff doesn't help.  Using
> the vi/nroff combination to do word processing is like using a keypunch/batch
> combination to program, it involves multiple steps and gives no immediate
> feedback.  Not to mention that much of vi's power is oriented towards programs
> (the %, for example), and it has a hard time even deleting a range of lines
> (unless you want to count lines and use 'dd', you must leave a mark, navigate
> to your intended end of deletion, then delete to the mark--slower than a
> mouse).

I've watched fast typists use vi.  These motions are very quick.  Locating
a mouse and using it doesn't seem to be as fast.

On the basic point--I agree.  Vi is not a wordprocessor.  It is, however,
an editor.  If one happens not to *care* what the output "looks like"
beyond knowing what nroff will do with it, it is a very effective tool.
Further--unlike many wordprocessing programs--it doesn't leave little
unwanted presents lying about in your text.

> Lastly, you do not have to take your hands off of the keyboard to do things
> in Mac word processors.  Most good ones, such as MS Word, have keyboard
> equivalents for anything you can do from the Mouse.  You use the mouse because
> you feel like it, or because you need the help finding your command.  You
> use the keyboard when you get around to memorizing the things... The best
> of both worlds, which vi does not offer.

The first effective defense I've seen.  Interestingly, you are pointing out
features of a program that came from an environment that never had a mouse
as a standard feature, rather than something ported from a mouse-driven
environment.  This makes it a one-way operation.  This could probably be
done to vi, too, should anyone want to.

     --Hal

=========================================================================
  Hal Heydt                             |    "Hafnium plus Holmium is
  Analyst, Pacific*Bell                 |     one-point-five, I think."
  415-645-7708                          |       --Dr. Jane Robinson
  {att,bellcore,sun,ames,pyramid}!pacbell!pbhya!whh   

rfarris@serene.UUCP (Rick Farris) (12/30/88)

In article <3504@geaclib.UUCP> rae@geaclib.UUCP (Reid Ellis) writes:

   Now with a keyboard, you don't want to restrict yourself to a few
   degrees of freedom [no pun intended there:-)] because the infrared
   beam has to be pointed *there*.


I think that until we build CRTs into our eyeglasses, we'll
generally be pointed *there* anyway.


Rick Farris   RF Engineering  POB M  Del Mar, CA  92014   voice (619) 259-6793
rfarris@serene.cts.com     ...!uunet!serene!rfarris       serene.UUCP 259-7757

c60a-2di@e260-1c.berkeley.edu (The Cybermat Rider) (12/30/88)

In article <3504@geaclib.UUCP> rae@geaclib.UUCP (Reid Ellis) writes:
[stuff about omni-directionality (?) of RF deleted]
>Now since we have other RF devices in the home [radio
>telephones spring to mind] which also can co-exist nicely, using different
>frequencies, why not keyboards?
>
>Is it a question of cost?

I don't think so - modems have been developed that use RF to communicate.
It's more a problem of RF waves from each keyboard interfering with
neighboring ones.  For wireless modems, this problem usually doesn't rear
its ugly head simply because no one would place 2 or more pairs of wireless
modems in the same room - in fact, if you need to connect 2 computers within
the same room, null modem cables usually suffice.  After all, the effective
transmission range of wireless modems is pretty short.

On the other hand, you're VERY LIKELY (in an office environment) to have
DOZENS of computers sitting in one room.  You could insist that the computer
companies concerned make their keyboards "tunable", but I doubt many people
would like to fiddle around with recessed potentiometers on the bottoms of
their keyboards, trying to adjust their transmission frequencies to avoid
interfering with other keyboards in the vicinity.

I think there may be problems regarding FCC clearance as well, but I'm not
an expert in that field, so I'll leave it to those in the know to enlighten
us all further.  Suffice it to say that the problems encountered with many
RF transmitters within a small space render this idea somewhat impractical.

Vive la cable!!  8-)

>-- 
>Reid Ellis
>geaclib!rae@geac.uucp

----------------------------------------------------------------------------
Adrian Ho a.k.a. The Cybermat Rider	  University of California, Berkeley
c60a-2di@web.berkeley.edu
Disclaimer:  Nobody takes me seriously, so is it really necessary?

cory@gloom.UUCP (Cory Kempf) (12/30/88)

In article <8299@ihlpl.ATT.COM> barth@ihlpl.UUCP (BARTH RICHARDS) writes:
>In article <1116@netxcom.UUCP> ewiles@netxcom.UUCP (Edwin Wiles) writes:
>>Yes.  Design your graphical interface to alternate rapidly between two
>>perspective images of the same object.  Interface that with a special pair
>>of glasses whos lenses are made of a rapid acting LCD material.  Set the
>>glasses so that the lenses alternate clear/dark in synch with the display.
>I know that such a system has been available for the Atari ST for at least a
>year.  I seem to remember that the needed hardware cost about $170. 

My SO brought up an interesting point on the practicality of
glasses... What if you are doing real work (as in not just playing
games), where you need to look away from the monitor to read a piece
of paper?  I have never used either of the 3D glasses ideas presented
here, but wouldn't the LCD idea interfere with normal vision?  Or is
the time that they are dark not long enough to really notice?  

Also, with the rotating polarized screen in front of the monitor for 
the other idea (polarized glasses), what happens when the screen is 
at a 45 degree angle w.r.t. the glasses?  What would be on the screen?

+C
-- 
Cory ( "...Love is like Oxygen..." ) Kempf
UUCP: encore.com!gloom!cory
	"...it's a mistake in the making."	-KT

kehr@felix.UUCP (Shirley Kehr) (12/30/88)

In article <4455@Portia.Stanford.EDU> rdsesq@Jessica.stanford.edu (Rob Snevely) writes:
<The issue is not ease of use, the issue is how effectively a person can
<use a program as a tool to make his/her life or job better or easier.
<The mac does have 1 advantage over emacs or wordstar, it is easier for a 
<new user to get up and running. However, once that user is up and running,
<the interface can slow down there speed and productivity. So I propose both,
<why cant we have a word processor that has two interfaces. A "user-friendly"
<pull down menu -- dialog based interface for new users. and a command
<oriented interface for advanced users. This would allow those users who
<want or need a command oriented interface access to it while allowing
<new or intermediate users to have the point and click. Also since the
<menu interface would be around all the time, it would help to eliminate
<the problems of going from on to the other cause they are interchangable.
<So if I do a lot of side by side formatting I could use the command
<interface to speed that along but I would be able to use the menu interface
<for things that I don't use all the time.

<From: rdsesq@Jessica.stanford.edu (Rob Snevely)
 
<Makes sense to me. what about you?
 
This is exactly what I do with Word.  I rarely pull down a menu anymore.  By
combining QuicKeys and an extended keyboard, you can eliminate most pull-down
menus.  This still leaves you with dialog boxes to fill out, but for
those cases where you choose the same item(s), you can use MacroMaker.
So, one of my macros turns hidden text on and off.  Another selects the
entire document, formats it in Courier font, and bumps the size down one
notch.  Most of this was a lot easier to do by using Word's key commands
in the macro.  But in some cases I used QuicKeys aliases within Apple's
MacroMaker.

The only problem I have now is that I'm running out of slots.  Basically
all 15 function keys are programmed unshifted, shifted, and with command.
I only have the control and option keys left, but no room to write on my
little cheat sheet.  I'm not defining any more commands until Word 4.0
comes out, when I'll probably have to start over since I hear they've
moved items on the menus.

Shirley Kehr

osmigo@ut-emx.UUCP (12/31/88)

[re: problems with multiple keyboards/computers using RF interfaces in ]
[     the same room.                                                   ]

I'm not so sure this would be a problem. In the rock bands that use such
devices, you see as many as half a dozen different instruments on the stage
at once, each using an RF device, each using a separate amplifier, etc., and
this problem never develops. I apologize for not being able to furnish more
specific technical information.

I really would like to see this idea pursued at greater length. Just think of
the convenience of having *all* the support hardware interface via RF: hard
disks, printers, modems, CPU's, etc. could be tucked neatly away on a bookshelf
across the room. No more piling 80 pounds of technojunk on top of your desk,
not to mention doing away with that big mess of cables. 

Ron

=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
>  Ron Morgan      {ames, utah-cs, uunet, gatech}!cs.utexas.edu!ut-emx!osmigo  <
>  Univ. of Texas    {harvard, pyramid, sequent}!cs.utexas.edu!ut-emx!osmigo   <
>  Austin, Texas          osmigo@ut-emx.UUCP       osmigo@emx.utexas.edu       <
=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+

holland@m2.csc.ti.com (Fred Hollander) (12/31/88)

In article <22626@pbhya.PacBell.COM> whh@pbhya.PacBell.COM (Wilson Heydt) writes:

>] Also, since you are wondering what intuitive means, I'll include an excerpt
>] from Webster's, "directly apprehended" and my own definition in this context,
>] "not requiring a manual or formal training".
>
>I'm perfectly happy to accept that definition.  Now please tell me what is
>intuitive about selecting some particular button to push on a mouse and how
>many times (*and* how fast) to press it?  The Mac use of the "double-click"
>is *not*--I repeat *not*--intuitively obvious.  And yet--you can not use
>a mac (at least as a novice) without knowing that.  Therefore--without
>"manual or formal training" the Mac is unusable--at least by me.  We can
>discuss what fraction of the populace shares that idiosyncracy, but until
>the fraction becomes very small *some* form of help will be needed for *any*
>system.

Perhaps you're right that there needs to be some introduction to the Macintosh
for new users.  But beyond that, there are many programs on the Mac that can
be effectively used without any manual or training.  Even more complex programs
can be used on a somewhat limited basis before referring to the manual.  I say
this from first hand experience as well as discussions with others.

>In the past, it would be quite proper to hire the president of a company
>in the face of a lack of technical knowledge.  This is no longer true.

I never said that it would be acceptable for an executive to be technically
inept.  Just that his/her time would be better spent elsewhere than in learning
a system that is more complex and powerful than required to fulfill his/her
needs.

>The president of the company had better be able assess the value and
>accuracy of such minor things a spreadsheet outputs.  If he cannot
>understand the powers and limits of computers he will not be able to
>understand the economy your company is operating in.  This could
>*seriously* impact *your* future.  Learn unix--not needed, per se--but
>he'd better know some system or other--at least well enough to get his
>mail.

Agreed.  This doesn't seem to be the intent of the other poster's response.

Fred Hollander
Computer Science Center
Texas Instruments, Inc.
holland%ti-csl@csnet-rela

The above statements are my own and not representative of Texas Instruments.

andrea@hp-sdd.HP.COM (Andrea K. Frankel) (12/31/88)

In article <267@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
>Also, nobody that I have introduced to the mac has ever (willingly)
>gone back to the pc... (well, I do know ONE person who prefers the
>pc... but then, his dad has worked for Big Blue for ever, and he was
>pretty wierd too -- (Hi Shelly Blumstein!)).

I worked on a Mac and willingly went back to my PC.  Now that Microsoft
Windows is available, and packages like Micrografx Designer, on a PC
(well, actually an AT) I can have the best of both worlds.  When I'm
doing something that is better suited for a visual/mouse interface, I
do it under Windows.  When the visual/mouse interface is too slow and
unwieldy, I pop back to DOS and EMACS.  What I dislike about the Mac
is the inability to do that, to decide on a moment-by-moment basis
which kind of interface is best suited to the task at hand.

>Looking at Apple's ever increasing portion of the market v IBM, I'ld
>say that this was about the best research that your going to get.
>It's pretty conclusive... also, the people who are buying IBMs &
>clones have either not used a Mac or are constrained by price.

Price is never NOT a consideration!  Even companies with bucks to spend
want to know what the most bang per buck is.  If you have specific
software you want to run that is only available on the Mac, obviously
you're going to get a Mac.  But if you need a general purpose
cost-effective hardware platform to run a variety of software and
user interfaces, it's still hard to beat a PC-compatible machine.
Why, they even sell them at the Price Club!  (I kid you not.)


Andrea Frankel, Hewlett-Packard (San Diego Division) (619) 592-4664
                "...I brought you a paddle for your favorite canoe."
______________________________________________________________________________
UUCP     : {hplabs|nosc|hpfcla|ucsd}!hp-sdd!andrea 
Internet : andrea%hp-sdd@hp-sde.sde.hp.com 
                    (or @hplabs.hp.com, @nosc.mil, @ucsd.edu)
USnail   : 16399 W. Bernardo Drive, San Diego CA 92127-1899 USA

barmar@think.COM (Barry Margolin) (12/31/88)

In article <22626@pbhya.PacBell.COM> whh@pbhya.PacBell.COM (Wilson Heydt) writes:
>  Now please tell me what is
>intuitive about selecting some particular button to push on a mouse and how
>many times (*and* how fast) to press it?  The Mac use of the "double-click"
>is *not*--I repeat *not*--intuitively obvious.  And yet--you can not use
>a mac (at least as a novice) without knowing that.

Apple's solution to the "selecting some particular button" problem was
to use a one-button mouse.  While I personally prefer the three-button
mouse on my Lisp Machine, I suspect Apple was correct because of their
intended audience.

As for the double-click, I've never used a Mac application that
requires the user to double-click.  It's always used as a short-cut
for some operation that can be performed in a more intuitive manner
(e.g. in the Finder it's a shortcut for the Open menu choice, in word
processors it's usually a shortcut for dragging the mouse over a whole
word).

Yes, there are some operations that can only be performed in
non-intuitive ways.  For example, paint/draw programs frequently make
the shift keys affect the way the mouse is used (such as forcing a
drag to be strictly vertical or horizontal).  And it's certainly not
obvious that you move a window by clicking in its title bar.  No one
ever said any of these environments are perfect.  However, if 90% of
the stuff one does is in a menu it means you only have to "learn" how
to do 10% of the things you need to do, instead of 100%.  This
obviously means that the menu-based system is more intuitive than the
command-based system.

It's like the "real world".  Driving a car is a relatively easy skill,
and many things are pretty intuitive (turn the wheel right and you go
right, step down harder and you go faster, "D" stands for Drive and
"R" for Reverse).  However, when I first sat behind the wheel, it
wasn't obvious to me that I needed to step on the brake before putting
the car into Drive.  I assumed that since I wasn't pressing on the
accelerator it wouldn't accelerate.  But just because there are some
non-intuitive aspects it doesn't mean that the whole system is
non-intuitive.


Barry Margolin
Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

andrea@hp-sdd.HP.COM (Andrea K. Frankel) (12/31/88)

In article <4455@Portia.Stanford.EDU> rdsesq@Jessica.stanford.edu (Rob Snevely) writes:
>The issue is not ease of use, the issue is how effectively a person can
>use a program as a tool to make his/her life or job better or easier.
>The mac does have 1 advantage over emacs or wordstar, it is easier for a 
>new user to get up and running. However, once that user is up and running,
>the interface can slow down there speed and productivity. 

I agree entirely.  This is why I DON'T like the Mac for heavy use,
although I appreciate it for certain graphics operations.

>So I propose both,
>why cant we have a word processor that has two interfaces. A "user-friendly"
>pull down menu -- dialog based interface for new users. and a command
>oriented interface for advanced users. This would allow those users who
>want or need a command oriented interface access to it while allowing
>new or intermediate users to have the point and click. Also since the
>menu interface would be around all the time, it would help to eliminate
>the problems of going from on to the other cause they are interchangable.

Check out Windows Write (bundled with Microsoft Windows).  It does
this exactly!

Windows is designed around both the mouse and the keyboard, and the
guidelines for software developers are that anything you can do with
the mouse, you'd better be able to do without one too.  The
"accelerators" as they're called (keystroke equivalents) are labelled
right in the pull-down menus, so that you can easily learn a new
shortcut for an operation you do frequently without having to rummage
through the manual.  But it's no problem if you forget one you haven't
used in a while, because the menus are still there.  You can decide on
a moment by moment basis whether you feel like typing <alt>-f-s or
pulling down File menu and clicking Save, for instance.

Of course, Windows Write is pretty brain-damaged, and I'd only
recommend it for short memos and writing short stories at home!
I'm looking forward to the near future when I expect to see some
REAL word processors "Windowized" to act in the same way.

Andrea Frankel, Hewlett-Packard (San Diego Division) (619) 592-4664
                "...I brought you a paddle for your favorite canoe."
______________________________________________________________________________
UUCP     : {hplabs|nosc|hpfcla|ucsd}!hp-sdd!andrea 
Internet : andrea%hp-sdd@hp-sde.sde.hp.com 
                    (or @hplabs.hp.com, @nosc.mil, @ucsd.edu)
USnail   : 16399 W. Bernardo Drive, San Diego CA 92127-1899 USA

folta@tove.umd.edu (Wayne Folta) (12/31/88)

I was hoping someone would fall for my teaser about vi not having wordwrap.
I know full well about the lame "wordwrap" setting.  But it is not true
wordwrap, it is more like an "auto-linefeed".  TRUE WORDWRAP sucks words
up onto the previous line when you delete, and fills further lines when
you insert.  Last I checked (and found wordwrap to be worthless), all
vi does is break to a new line when your cursor goes past your margin setting.
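
To make the distinction concrete, here is a minimal Python sketch (illustrative
only, not vi's actual behavior): true wordwrap re-fills the whole paragraph
after an edit, so words flow back up onto earlier lines, whereas a wrapmargin
merely breaks the line being typed.

    # True wordwrap: re-fill the entire paragraph to the margin after any edit.
    import textwrap

    def refill(paragraph, margin=40):
        """Re-flow a paragraph so every line is filled up to the margin."""
        return textwrap.fill(" ".join(paragraph.split()), width=margin)

    text = ("TRUE WORDWRAP sucks words up onto the previous line "
            "when you delete, and fills further lines when you insert.")
    print(refill(text))
    print()
    print(refill(text.replace("sucks words up onto", "pulls text back to")))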



Wayne Folta          (folta@tove.umd.edu  128.8.128.42)

rfarris@serene.UUCP (Rick Farris) (12/31/88)

In article <9146@ut-emx.UUCP> osmigo@emx.UUCP (Ron Morgan) writes:

> I really would like to see this idea pursued at greater length. Just
> think of the convenience of having *all* the support hardware
> interface via RF: hard disks, printers, modems, CPU's, etc. could be
> tucked neatly away on a bookshelf across the room.

A low speed keyboard interface might be feasible, but if you consider
a high-speed disk interface, at say, 1MB/sec, communicated in a
serial manner, the bandwidth needed would be around 10 MHz.  I don't
remember the precise bandwidth transmitted by TV stations, but I
suspect that you could fit 2 television signals into a 10 MHz band.

I don't think the FCC would give you the channels.

Also, even if you tied the disk drive to the system unit with wire, a
megapixel display would have the same problem.
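
As a rough check of those numbers, a back-of-envelope sketch in Python; the
assumption of roughly one bit per second per hertz of bandwidth is mine, and
real modulation schemes vary:

    # Back-of-envelope: bandwidth needed to move 1 MB/sec serially.
    disk_rate_bytes = 1_000_000            # 1 MB/sec disk transfer
    bits_per_sec    = disk_rate_bytes * 8  # 8 Mbit/sec
    bandwidth_hz    = bits_per_sec * 1.0   # assume ~1 Hz per bit/sec
    print(bandwidth_hz / 1e6, "MHz")               # ~8 MHz
    print(bandwidth_hz / 6e6, "NTSC TV channels")  # an NTSC channel is 6 MHz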


Rick Farris   RF Engineering  POB M  Del Mar, CA  92014   voice (619) 259-6793
rfarris@serene.cts.com     ...!uunet!serene!rfarris       serene.UUCP 259-7757

c60a-2di@e260-2d.berkeley.edu (The Cybermat Rider) (12/31/88)

I just thought of another (possibly) restrictive factor with respect to
RF-transmitting keyboards: power requirements.  I assume that the keyboard
would have an on-board power source (probably a battery pack), as I don't
know of any power transmission systems whose receivers are small enough to
fit in a keyboard.  Therefore, power drain would become an important
consideration.

I know from personal experience (with IR remote controls, no less) that IR
LEDs draw just a little power, and even considering the number of characters
transmitted during an average session, I'm sure it wouldn't be too difficult
to develop a battery pack that would allow activation of an IR LED at least
10 billion times (rechargeable battery packs would be even better - just set
it down in a special charging tray when you've finished).

However, I'm not so sure about power requirements of RF transmitters.  Can
anyone with practical experience make an educated guess as to the power
drain of a miniature transmitter (in the context of a wireless keyboard)?

----------------------------------------------------------------------------
Adrian Ho a.k.a. The Cybermat Rider	  University of California, Berkeley
c60a-2di@web.berkeley.edu
Disclaimer:  Nobody takes me seriously, so is it really necessary?

oster@dewey.soe.berkeley.edu (David Phillip Oster) (01/01/89)

In article <8620@alice.UUCP> debra@alice.UUCP () writes:
>There are 2 problems:
>- moving the icon for a diskette to the trash can does not eject the
>  diskette completely, so even when I put a trash can in the right position
>  the diskette does not end up in the can :-)

It is possible to place something on top of something else, but not in it.
The Macintosh hilites the destination if you drag an object so that it
will go _in_ it.

>- moving the icon for the hard disk does not eject it. when i first wanted
>  to try this several of my colleagues got very upset and prevented me from
>  trying this :-)

This is a useful and harmless command.  It does "eject" the hard disk: it
makes it inaccessible until you re-boot.  It is useful before running
software that you don't want to know about your hard disk.  (Of course,
the system won't let you do this if you are running off the hard disk.  You
have to boot from a floppy before you can do this, or switch-launch to a
floppy.)

The drag-floppy-to-trashcan-to-eject is an idiom that developed over time.
New users don't get told the whole story, and they don't read the manual,
so they are distressed by this.  What is really going on is a shorthand
for two separate commands:

1.) Using "Eject" from the file menu.
2.) Dragging the ghost image of the deleted floppy to the trashcan,
because  we really don't want it taking up desktop space.

The old, two step process is still available, for the nervous, but it is
no longer _required_ as it was in the earliest releases of the Finder.

Experts, when you teach the mac, teach the old way first! (Sheesh, there
is no pleasing people. The novices want strict metaphors so they will know
how things will behave, and the experts want to reduce their hand motion
for frequently used operations.)

jbn@glacier.STANFORD.EDU (John B. Nagle) (01/01/89)

In article <18655@agate.BERKELEY.EDU> c60a-2di@e260-2d.berkeley.edu (The Cybermat Rider) writes:
>I know from personal experience (with IR remote controls, no less) that IR
>LEDs draw just a little power, and even considering the number of characters
>transmitted during an average session, I'm sure it wouldn't be too difficult
>to develop a battery pack that would allow activation of an IR LED at least
>10 billion times.

      Well, let's see.  Typical LEDs draw 20-100 mA at 5v.  Let's use 25mA
as a working figure.  Assume 1200 bit/sec transmission from a keyboard, and
that each character transmitted requires 12 bits (startup, framing, redundancy,
and shifts.)  So we take 10ms per character typed.  Each keystroke thus
uses 0.25 mA-sec, or 7 x 10^-8 ampere-hours.  NiCd batteries run about 0.1
ampere-hour in AA size.  So 3 AA NiCd batteries should be good for about 1.4
million keystrokes, not counting any power consumption in the remainder of
the electronics.  If we assume an average keying rate of 2 keystrokes/second,
we have an operating life of 200 hours.  

      These are very optimistic assumptions.  We've ignored any power
consumption from the rest of the system, and may want to use more power
than 25mA.  If we want to use diffuse infrared, bouncing off walls,
rather than requiring line-of-sight, 500mA would be more realistic.
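
The same arithmetic, restated as a small Python sketch using the post's own
figures (25 mA LED drive, 1200 bit/sec, 12 bits per character, 0.1 ampere-hour
of battery); the numbers are assumptions carried over from above, not
measurements:

    # Reproduces the back-of-envelope battery-life estimate above.
    led_current_a = 0.025          # 25 mA while the LED is on
    bit_rate      = 1200.0         # bits per second
    bits_per_key  = 12             # start/framing/redundancy/shift bits
    battery_ah    = 0.1            # assumed usable AA NiCd capacity

    seconds_per_key = bits_per_key / bit_rate                  # 0.01 s
    ah_per_key      = led_current_a * seconds_per_key / 3600   # ~7e-8 Ah
    keystrokes      = battery_ah / ah_per_key                  # ~1.4 million
    hours_at_2cps   = keystrokes / 2 / 3600                    # ~200 hours
    print(int(keystrokes), "keystrokes,", round(hours_at_2cps), "hours")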

						John Nagle

jkrueger@daitc.daitc.mil (Jonathan Krueger) (01/01/89)

In article <851@mtfmi.att.com>, mel@mtfmi (M.HAAS) writes:
>Would it be worthwhile to investigate mapping techniques, notations,
>names, and symbols for the user interface to computer systems?

See Edward Tufte: The Visual Display of Quantitative Information.
Good displays are well understood, and by Tufte, well described.

It doesn't matter whether they're on screens or paper.

What matters is what they show us that wouldn't be as clear from other
kinds of presentation, such as verbal or tabular.

-- Jon
-- 

jkrueger@daitc.daitc.mil (Jonathan Krueger) (01/02/89)

In article <22627@pbhya.PacBell.COM>, whh@pbhya (Wilson Heydt) writes:
>In article <15191@mimsy.UUCP>, folta@tove.umd.edu (Wayne Folta) writes:
>> And vi is NOT a word processor by any means.  It is TOTALLY line-oriented,
>> and has no concept at all of, say, wordwrap.  I'm a power vi user, myself
>                                     ^^^^^^^^
>Try "set wordwrap=" and you will find that vi understands it very nicely.
>(Given this, please define "power user".)

Someone who uses "set wrapmargin=<int>" ?

-- Jon
-- 

pardo@june.cs.washington.edu (David Keppel) (01/02/89)

In article <282@gloom.UUCP> cory@gloom.UUCP (Cory Kempf) writes:
>[Polarized 3-D glasses: what do you see when you look off-screen?]

The glasses are like sunglasses in normal light.

Followups to comp.cog-eng.

	;-D on  ( Look, ma!  I'm in 3-D! )  Pardo
-- 
		    pardo@cs.washington.edu
    {rutgers,cornell,ucsd,ubc-cs,tektronix}!uw-beaver!june!pardo

bwk@mbunix.mitre.org (Barry W. Kort) (01/02/89)

In article <3173@sugar.uu.net> peter@sugar.uu.net (Peter da Silva) asks:

 > What object on the desktop are pull-down menus a metaphor for?

Have you never ordered a pizza, sub, or Chinese orgy while working
late at the office?

--Barry Kort

dav@eleazar.dartmouth.edu (William David Haas) (01/03/89)

In article <242@serene.UUCP> rfarris@serene.uu.net (Rick Farris) writes:
>In article <3504@geaclib.UUCP> rae@geaclib.UUCP (Reid Ellis) writes:
>
<   Now with a keyboard, you don't want to restrict yourself to a few
<   degrees of freedom [no pun intended there:-)] because the infrared
<   beam has to be pointed *there*.
<
<
<I think that until we build crts into our eyeglasses, that we'll
<generally be pointed *there* anyway.
<
<
I like to sit with my feet on my desk and my keyboard in my lap.  The
side of the keyboard is facing the crt and which side depends on which
direction I face.

domo@riddle.UUCP (Dominic Dunlop) (01/03/89)

In article <851@mtfmi.att.com> mel@mtfmi.att.com (M.HAAS) writes:
>Perhaps we could start off with maps...
>
>[interesting argument deleted]
>
>...  Where are the computer
>equivalents of the double white line?  the "Exit 109, Red Bank" sign?
>the "Ramp Speed 25 mph" sign?  the "No U Turn" sign?  the "MacDonalds
>8 miles at Exit 13" sign?

Hmmm.  While the maps idea is good (given enough pixels and layers on the
screen -- look at a worthwhile map and be surprised by the print quality it
requires), mention of US-specific road markings and signs makes alarm bells
ring in my head.  Difficult though it may be, can the next metaphor (as
opposed, perhaps, to the NeXT metaphor) be international in its
applicability?  The desk-top metaphor, as implemented in various guises,
does pretty well around the world -- even if the Rolodex is not well-known
outside the US.  (Britain has its Filofax.  What does the rest of the
world use?)  Maps, too, have the potential to be non culture-specific --
provided that the implementors keep that goal in mind.
-- 
Dominic Dunlop
domo@sphinx.co.uk  domo@riddle.uucp

anson@spray.CalComp.COM (Ed Anson) (01/03/89)

In article <4510@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>
> Or is taking a little time to learn how to use a tool a
>dirty word?
>
>There seems to be a fascination in this field with catering to some
>mythical person with a two-digit IQ, total fear of computers, and not
>enough technical sense to operate a push-button phone.

For the record: I have a three-digit IQ, I love computers, and I routinely
operate a push-button phone. And I insist that my computer use a consistent
metaphor.

Yes, I have spent a considerable amount of time learning to use my Mac as
a tool. I routinely use a dozen or so applications, switching between them
frequently. I also use an uncounted number of other applications from time
to time. Frankly, I couldn't be bothered to learn (and try to remember)
a dozen or more arcane and inconsistent command languages. I use my Mac as a
tool, not as a means to demonstrate irrelevant feats of memory. I spend my
energy learning the craft of using each application, not the mechanics of
"power" features.

I do remember a few of the most-used command-key equivalents in some of
the applications I use.  But I use the menus for about 90% of the operations
I perform (about 10% of the time).  This is mostly because the use of command-
key equivalents tends to be inconsistent between applications (shame on the
developers), and I don't want to even try to remember who did what how.

It is the metaphor (desktop, or whatever) supported by a computer which helps
to make applications consistent. It is that consistency which makes it
useful for a regular user to learn a large number of applications, or for
a casual user to learn any applications at all. Yes, anybody with a moderate
intelligence can learn to use ANY one application (or two). And yes, any
major application should include "power" features for those who care to
master them. Most do. But a metaphor is still essential to integrating the
computer as a whole into our lives.

By the way, I happen to think it's about time we find a stronger metaphor.
Most of the things we do on the Mac now don't relate all that well to the
desktop metaphor. We need something that can be developed more consistently
throughout all applications. (Sorry. I don't have any suggestions just now.)

DISCLAIMER: The opinions expressed above are my own. Don't blame anyone else.
-- 
=====================================================================
   Ed Anson,    Calcomp Display Products Division,    Hudson NH 03051
   (603) 885-8712,      anson@elrond.CalComp.COM

wtm@neoucom.UUCP (Bill Mayhew) (01/04/89)

It is most likely that the word wrap feature of vi is no more than
an auto line-feed because "glass tty" terminals such as the ADM-3
had to be (and still are) supported.  On an ADM-3, if one deleted a
word near the top of the screen, the whole screen would have to be
refreshed in order to close up the gap created--an annoying effect
even at 9600 baud.  Now that we have 19.2K (or more) links and
terminals with smart scrolling features, a smart word wrap would
not be so yucky for the end user.

There was also the understanding that any text would be handed off
to nroff or one of its friends, which would make filled lines at
print time.  This philosophy is illustrated by the fact that much of
the old stuff in /usr/doc is not very human-readable until it is
*roff'ed.

So, in summary, I believe the designers of vi felt that the
omission of a smart word wrap was a feature rather than a bug.
Perhaps version 4.x of vi could be modernized to support true word
wrapping.

-Bill

kevin@gtisqr.UUCP (Kevin Bagley) (01/04/89)

In article <4510@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>From: holland@m2.csc.ti.com (Fred Hollander)
>>Typical Mac Word Processor:
>>        Find:   word    Replace With:   new-word
>>vi:
>>        ^[:.,$s/word/new-word/g
>>Can you tell which one is more intuitive?  
 [stuff deleted]
>The issue is not which is more "intuitive" (whatever that means)
 [stuff deleted]
>Or is taking a little time to learn how to use a tool a dirty word?
 For myself and my co-workers, 'NO', but for my sister-in-law running
 a tanning-parlor whose least desire in the world is to learn programming,
 'YES!' emphatically. Especially when the power and complexity would
 be a total waste of time.

 [more stuff deleted]
>There seems to be a fascination in this field with catering to some
>mythical person with a two-digit IQ, total fear of computers, and not
>enough technical sense to operate a push-button phone.
 I think that creating a smoother and more 'intuitive' interface is
 *not* catering to a lesser IQ, but rather to a person who may have
 a higher IQ than you, but who has different priorities and responsibilities.

>Perhaps we are actually patting ourselves on the head and trying to
>convince the world how hard what we do is? Hmmm?
 I doubt it!  I use emacs (and vi) almost exclusively at work where
 this type of power is needed.  At home, I use WriteNow, which for
 many things is a true relief from emacs and vi. Even though WriteNow
 is much more intuitive, it is also more powerful for certain things.

>Much of it really isn't, I've seen many people of mean talent handle
>vi or emacs perfectly well, and spent far too many hours listening to
>boors "prove" to me that it's not possible, that holding down a
>control key is just way beyond the ability of (that loathsome
>sub-human drooling moron) the secretary.

 Perhaps the secretary has more important secretarial tasks to do
 than spend several days/weeks getting up to speed on the 'ultra-
 powerful-meta-escape-key' driven, binding-table, macro capable
 word processor. I think creating idiots out of those that don't
 use "your" favorite WP is idiotic.

>My suggestion is that when you find such people don't hire them as
>they will probably be poorly suited to the rest of the skilled white
>collar job they are being considered for, let them find more
>appropriate work (for both of you.)
 This amounts to nothing more than high-tech discrimination.
 If the person can accomplish the task with the tools at hand, be they
 a 'Maclike' word processor or 'vi', or whatever, they should be considered
 for the job. I certainly would not discriminate on the basis that a
 person had not been schooled in the almost infinite variety of word-
 processors that exist.

 I use and like EMACS. I don't feel that a person that doesn't use
 EMACS is some moron that can't dial a phone. For some things, emacs
 and vi are very weak, such as indexing, generating table-of-contents,
 red-lining, etc. etc. But that doesn't make emacs users idiots with
 less brains than a slug (washington state pet).

>	-Barry Shein, ||Encore||

 Grudgingly prepared with emacs...

magill@eniac.seas.upenn.edu (Operations Manager) (01/04/89)

<   It's interesting.  Thompson was always referred to as a "futurist", and he
<   figured that meant he was about 20 years ahead of everybody else.  In this
<   case, he's pretty close.
<
I and several others around here have often "speculated" upon this topic.
While not particularly germane to the DeskTop discussion, it is
interesting.  We have decided that there are about 5-10, but definitely less
than 25, persons on a campus of 18K students and 18K employees who know
about and use "the network as a computer"; there is a community of probably
10% faculty/staff and 50% students who actively use electronic mail, BITNET
listservers, etc.  (Engineering is close to 100% for Faculty/Staff/Students,
while Humanities usage is almost non-existent.)  We hung some times on
these categories and decided that those of us who were active network users
were about 5 years ahead of the electronic mail users, and 5-10 years ahead
of everybody else.

I've been a member of the World Future Society for quite a few years, and
one of the things that emerges constantly is the simple fact that it takes
a minimum number of years - on the order of 3-5 - from the time a new
technology has moved from the Lab to the Factory for the production lines
to fire up and distribute it.

So if you add another 5 years for ideas to make it through the lab to the
factory floor, the total comes up pretty close to 20 years.

--
William H. Magill 			 Manager, PENNnet Operations Planning
Data Communications and Computing Services (DCCS)  University of Pennsylvania
Internet: magill@dccs.upenn.edu			  magill@eniac.seas.upenn.edu
          magill@upenn.edu 		BITnet:   magill@pennlrsm

lrbartram@watcgl.waterloo.edu (lyn bartram) (01/05/89)

In article <624@hindmost.gtisqr.UUCP> kevin@hindmost.UUCP (Kevin Bagley) writes:
>In article <4510@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>>The issue is not which is more "intuitive" (whatever that means)
> [stuff deleted]
>>Or is taking a little time to learn how to use a tool a dirty word?
> For myself and my co-workers, 'NO', but for my sister-in-law running
> a tanning-parlor whose least desire in the world is to learn programming,
> 'YES!' emphatically. Especially when the power and complexity would
> be a total waste of time.

	There is another issue here.  It is simplistic to assume that
ease of use is only related to the primary learning curve.  An interface which 
is cryptic, inconsistent and incongruent with the user's existing knowledge in
other areas is not only difficult to learn - it remains difficult to remember
and, more importantly, to predict.  The ability to infer further ways of using 
the system from its currently known procedures directly affects how easy the
system is to learn on a continual basis and to use.  This current discussion 
has centred around word processing/text editing as the canonical interface,
and while this is a well-known case, I think there are some serious drawbacks
with using it as THE classic case.  For one, use is non-critical.  By this I
mean that an error is unlikely to be fatal (dependent on temperament, I
suppose) and there is a flexible time leeway to investigate and decide how to
accomplish the task at hand.  What about interactive systems where time (or
some other variable) is critical?  The air traffic controller may be able to
afford a long learning curve at the beginning.  However, (s)he cannot afford a
heavy cognitive load on memory, prediction and interpretation when using the
system, and all decisions must be made within a fixed time limit.  This is
where "intuitive" stops being a trendy buzzword and becomes a crucial feature.
Studies have shown that an inconsistent interface (such as the command line 
interface of UNIX) and poor feedback (such as the lack of mode reflection
in vi) are related not only to a steep learning curve but also to a continued
error rate.  Moreover, as Draper has pointed out, the difference between the
expert and the novice in a system like UNIX is that the expert knows where to
look to find things out and the novice cannot fathom the complex paths for
looking.  Prediction - or more properly, inference - is difficult in such a
system.  The user must infer secondary procedures - i.e., how to find out
about the actual operation desired - rather than primary procedures - how to
actually do it.

>>There seems to be a fascination in this field with catering to some
>>mythical person with a two-digit IQ, total fear of computers, and not
>>enough technical sense to operate a push-button phone.
> I think that creating a smoother and more 'intuituve' interface is
> *not* catering to a lesser IQ, but rather to a person who may have
> a higher IQ than you, but who has different priorities and responsibilities.

Agreed!
What is the point of an "intuitive" interface?  To enable the user to operate
from a set of models and guidelines that do not require conscious and rational
deduction at each step.  The air traffic controller probably does not have a
two-digit IQ nor an irrational fear of computers - just a lack of 
contemplative and predictive time.  The above comment (delineated by the >> 
brackets!) is a typical example of comments from those who think that the
main need for user interfaces lies in the domain of text editing.  Actually,
much of what we can learn from these applications is more crucial in other
types of systems.  So think twice before you sneer at supposed "ease of
use" and studies in "intuitive" interface design.

wcs@skep2.ATT.COM (Bill.Stewart.[ho95c]) (01/05/89)

Peter da Silva writes:
> Just being a troublemaker...
> What object on the desktop are pull-down menus a metaphor for?

Getting the manual down off the shelf ....

I use vi instead of emacs because I learned it first and my
fingers know how to do things most emacs-users can't do
without keeping the manual around*.  But I'd rather use a Macintosh,
since I seldom have to remember anything; I can just do the
obvious and it works.  (It's especially critical since the places I use
Macs don't tend to have the manuals handy.)  I'd rather have
pop-up menus and a multi-button mouse, but the Mac is close enough.

Most of the Mac-haters I know are touch-typists who don't
like moving their hands off the keyboard - the head-mouse
may be able to let them get the best of both worlds.
When I'm using the Mac for a while, I tend to do most of my
work by keystrokes, but pull-downs are still there when I need a crutch.
Wish the implementation was better and there were more keys .....

The Mac desktop isn't really enough for me - I've gotten
spoiled by large multi-window screens with multi-tasking.
On the other hand, multiple fonts and more-or-less WYSIWYG
are such a win over 24x80 monospace that I'll happily
tolerate it as long as I'm using it as a writing/design tool
rather than a programming environment.  

---
* I'm not trying to restart Editor Wars here - this is just
  a response to the people who say menus and pull-downs are
  for people with small brains who won't read the manual.
  I'm happy to use systems that usually do what I want, so I
  can use the mental effort on the problems I'm trying to
  solve, or on the more obscure parts of the work, rather
  than wasting my time getting the typesetting to look decent.

			Bill
-- 
#				Thanks;
# Bill Stewart, AT&T Bell Labs 2G218 Holmdel NJ 201-949-0705 ho95c.att.com!wcs
#
#	News.  Don't talk to me about News.

mclauss@enlog.Wichita.NCR.COM (Mark Clauss) (01/05/89)

In article <1489@umbc3.UMD.EDU> cs374326@umbc3.UMD.EDU (Peter Johansson) writes:
>We are running a network
>of roughly one dozen Mac ][s, connected to one mac ][ with a HD as a file
>server and print spooler.  The first big problem was that the system software
>(6.0) would crash several times daily. ... 
>......., I personally found I was spending
>*more* time trying to figure things out than I was actually doing work.

Stop and think now about how much -more- work was done on the Mac by the 
people that don't understand operating systems than they would have done
at the unix prompt.

>How much research has actually gone into discovering what Joe Schmoe,
>small and medium sized business owner, wants on his desk?  Does he want
>a gas-plasma-wall-hanging-display unit and an infra-red-input-device?

I have not done any research on the subject, but I sure would like to 
see a design system that allows me to display a full schematic page 
next to the simulation traces for that page. This would require a very 
large display. I vote yes. As for small businessmen, if the system also
includes a scanning and retrieval method, I'm sure they would get into
the same situation trying to compare two documents in windows too small
to show the documents full size.

>Addmitedly, this *is* comp.sys.next, and there aren't very many end users
>(of the NeXT machine) and in the near future I see no coporate use of the
>machine planed.  .......
>I'm also curious just what percentage of the end-user computing market
>the graphical interface has captured, and what their opinions of it are.

I think the reason that the accountants and lawyers and mail order entry
people don't use graphical interfaces is primarily cost. There is a lot 
of money to be made if you can design a good sized graphics display and 
an interface to it that allows large numbers of graphics terminals to run
on one central box. The programmer will always deliver his application 
in the least expensive manner. When the application becomes too complex
to deliver on a simple text screen, it is no longer cost effective to 
do so. 

If users have never seen a graphical interface or a pointer, they will
be happy to use what is available. The current problem with most graphics
interfaces stems from the fact that they are designed to work on 
workstations. Unless a network of workstations is less expensive
than a central computer and a less capable terminal, there is no market
for the graphics. Note that expensive here refers to total cost vs.
total productivity.

The unsophisticated user wants the graphics to be transparent and reliable.
He or she wants applications that help him or her do the job in a cost-
effective manner. Our job as hardware and software engineers is to 
figure out how to take the things that nature gives us and help these 
people do their jobs. Sometimes that is fun, and sometimes it isn't.

-- 
Mark Clauss  Hardware Engineering, NCR Wichita <Mark.Clauss@Wichita.NCR.COM>
 NCR:654-8120                              <{uunet}!ncrlnk!
(316)636-8120         <{ece-csc,hubcap,gould,rtech}!ncrcae!ncrwic!mark.clauss>
           <{sdcsvax,cbatt,dcdwest,nosc.ARPA,ihnp4}!ncr-sd!


kehr@felix.UUCP (Shirley Kehr) (01/05/89)

In article <2554@spray.CalComp.COM> anson@spray.UUCP (Ed Anson) writes:
 
<I do remember a few of the most-used command key equivalents on some of
<the applications I use. But I use the menus for about 90% of the operations
<I perform (about 10% of the time). This is mostly because use of command
<key equivalents tend to be inconsistent between applications (shame on the
<developers), and I don't want to even try to remember who did what how.

I was particularly pleased to see that Canvas 2.0 uses key commands very
much like Microsoft Word (e.g., Cmd-Shift-i for italic, b for bold, etc.).
This makes it much easier, since text is a major part of my Canvas "drawings."

Shirley Kehr

norman@cogsci.ucsd.EDU (Donald A Norman-UCSD Cog Sci Dept) (01/06/89)

This discussion started off strong but has deteriorated into
name-calling and story telling.  It is not clear why anyone thinks there is
any single answer to the question of the appropriate interaction
technique: individual differences and individual preferences are
strong and relevant, and should always be respected.

But I have to put my 2 cents in again and ask why some of you want to
leave out half of the world's population?   Certainly you wouldn't
design computer systems only for women and leave out men, so why leave
out 2 digit IQs?

The quote I am referring to is this:
 >There seems to be a fascination in this field with catering to some
 >mythical person with a two-digit IQ ...

Half the world has 2-digit IQs. Another way of putting it is that half
of the world is below average.

The IQ tests are designed so that the mean IQ is 100 (and, the distribution
being symmetric, the median as well).

don norman

Donald A. Norman	[ danorman@ucsd.edu   BITNET: danorman@ucsd ]
Department of Cognitive Science C-015
University of California, San Diego
La Jolla, California 92093 USA

UNIX:  {gatech,rutgers,ucbvax,uunet}!ucsd!danorman
[e-mail paths often fail: please give postal address and all e-mail addresses.]

ingemar@isy.liu.se (Ingemar Ragnemalm) (01/06/89)

In article <1805@hp-sdd.HP.COM> andrea@hp-sdd.UUCP (Andrea K. Frankel) writes:
>
>In article <4455@Portia.Stanford.EDU> rdsesq@Jessica.stanford.edu (Rob Snevely) writes:
>>The issue is not ease of use, the issue is how effectively a person can
>>use a program as a tool to make his/her life or job better or easier.
>>The mac does have 1 advantage over emacs or wordstar, it is easier for a 
>>new user to get up and running. However, once that user is up and running,
>>the interface can slow down there speed and productivity. 
>
>I agree entirely.  This is why I DON'T like the Mac for heavy use,
>although I appreciate it for certain graphics operations.

I don't agree. The advantage is that a user may have a much larger collection
of programs that he uses infrequently. A program you use every day can be
non-intuitive, illogical and hard to learn; that is no problem once you
have learned it. The utilities that you use once a year are another matter.

Just take a look at the Mac programs around. There are a number of easy-
to-use utilities, simple beginners' programs like MacWrite, but for the
professionals there are things like MSWord and others that have a lot
of features and require a manual. There is a MacTeX, too.
There are easy programming environments like Turbo Pascal and
Lightspeed C/Pascal, but for the more heavy-duty programming there is also
MPW, a UNIX-like environment with pipes, scripts and a text interface.

This entire debate about the Mac-type interfaces is rather boring.
Why don't you people who have never used a Mac stop suggesting
improvements that have already been made? Can we please skip the "my *
is best because I use it"-garbage? (Replace * with any computer, OS,
program, car etc.)

OK, back to business.

An idea that has been mentioned (by Stephen Baumgarten) is to have
some kind of text interface apart from MPW, which is large and expensive.
I believe there would be some points in favor of this:
- The possibility of using some kind of pipes and scripts.
- The possibility of connecting a VT100-style terminal to one of the Mac's
serial ports in order to allow one person to do strict text work without
taking the Mac away from the person who wants to do some layout or
drawings, and without having to buy UNIX (A/UX).
   One could do this as a DA (or an application, when everybody really has
switched to MultiFinder) which puts up a little window for command lines.
   The point is that it should not be an alternative to the desktop, like MPW,
but a complement. The commands should preferably be files with code resources
of a special type (since I guess that APPL would be inappropriate, just as for
the MPW tools), and searched for through a path - a very rough sketch of the
lookup loop follows below.
Call it reinventing MPW or writing a shell, whatever you like. It could
be useful. Maybe. What do you think?
(Suggested name: MacNeanderthal :-))
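
To make the "search a path, then run it" shape concrete, here is a very rough
sketch in plain C.  It is not Toolbox code: the folder names, the prompt and
the run_command() stub are all made up for illustration, and on a real Mac the
"run" step would mean loading the tool's code resource and jumping into it
rather than printing a message.

/* Toy command-line lookup: read a command, search a small "path" of
 * folders for a file with that name, and dispatch it.  Everything here
 * is hypothetical; it only illustrates the shape of the idea.
 */
#include <stdio.h>
#include <string.h>

#define MAXLINE 256

static const char *path[] = { "tools/", "more-tools/", NULL };

/* Stand-in for "load the code resource and call it". */
static void run_command(const char *folder, const char *cmd, const char *args)
{
    printf("would load '%s%s' and pass it: %s\n", folder, cmd, args);
}

int main(void)
{
    char line[MAXLINE], full[MAXLINE + 16];

    for (;;) {
        char *cmd, *args;
        int i, found = 0;

        printf("> ");
        fflush(stdout);
        if (fgets(line, sizeof line, stdin) == NULL)
            break;
        cmd  = strtok(line, " \t\n");
        args = strtok(NULL, "\n");

        for (i = 0; cmd != NULL && path[i] != NULL && !found; i++) {
            FILE *f;
            sprintf(full, "%s%s", path[i], cmd);
            if ((f = fopen(full, "rb")) != NULL) {   /* does the tool exist here? */
                fclose(f);
                run_command(path[i], cmd, args ? args : "");
                found = 1;
            }
        }
        if (cmd != NULL && !found)
            printf("%s: not found on the path\n", cmd);
    }
    return 0;
}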


OK, back to the flames...
(FLAME ON)

>>So I propose both,
>>why cant we have a word processor that has two interfaces. A "user-friendly"
>>pull down menu -- dialog based interface for new users. and a command
>>oriented interface for advanced users. This would allow those users who
>>want or need a command oriented interface access to it while allowing
>>new or intermediate users to have the point and click. Also since the
>>menu interface would be around all the time, it would help to eliminate
>>the problems of going from on to the other cause they are interchangable.
>
>Check out Windows Write (bundled with Microsoft Windows).  It does
>this exactly!
>
>Windows is designed around both the mouse and the keyboard, and the
>guidelines for software developers are that anything you can do with
>the mouse, you'd better be able to do without one too.  The
>"accelerators" as they're called (keystroke equivalents) are labelled
>right in the pull-down menus, so that you can easily learn a new
>shortcut for an operation you do frequently without having to rummage
>through the manual.  But it's no problem if you forget one you haven't
>used in a while, because the menus are still there.  You can decide on
>a moment by moment basis whether you feel like typing <alt>-f-s or
>pulling down File menu and clicking Save, for instance.
>
>Of course, Windows Write is pretty brain-damaged, and I'd only
>recommend it for short memos and writing short stories at home!
>I'm looking forward to the near future when I expect to see some
>REAL word processors "Windowized" to act in the same way.

>Andrea Frankel, Hewlett-Packard (San Diego Division) (619) 592-4664

This is ridiculous! Just replace "Windows Write" with "MacWrite" and
"Windows" with "Mac Toolbox" and every word is true, except for the
last sentence - the real word processors arrived long ago! Are you sure this
article isn't a mutation of some Mac article from '84?

In particular, a Mac program that is supposed to be used regularly
should *never* force the user to walk through a lot of menus and/or
dialogs for anything but the most infrequent operations! To me, the menus
are a kind of short on-line manual.

In my work, I use Mac and Sun, and SunTools doesn't seem to use key
equivalents as a standard. (It is nice in other respects, though.)


/Ingemar Ragnemalm
-- 
Dept. of Electrical Engineering	     ...!uunet!mcvax!enea!rainier!ingemar
                  ..
University of Linkoping, Sweden	     ingemar@isy.liu.se

ralphw@ius3.ius.cs.cmu.edu (Ralph Hyre) (01/07/89)

In article <27265@ucbvax.BERKELEY.EDU> oster@dewey.soe.berkeley.edu.UUCP (David Phillip Oster) writes:
>She also composes, using a music-score processor. she uses the mouse in
>one hand to place notes on staves, and the keyboard under the other to
>select which kind of note the mouse will leave.  
Why not use a real piano-style keyboard for this?  It's hard to get more
natural, unless you have no piano experience.  Or use a MIDI guitar, or
anything MIDI-interfaceable that you can easily attach to most
computers.  For editing, it should be sufficient to point at the
note with the mouse and change it.
-- 
					- Ralph W. Hyre, Jr.
Internet: ralphw@{ius{3,2,1}.,}cs.cmu.edu    Phone:(412) CMU-BUGS
Amateur Packet Radio: N3FGW@W2XO, or c/o W3VC, CMU Radio Club, Pittsburgh, PA
"You can do what you want with my computer, but leave me alone!8-)"

dlm@cuuxb.ATT.COM (Dennis L. Mumaugh) (01/07/89)

In article <4362@pitt.UUCP> bonar@pitt.UUCP (Dr. Jeffrey Bonar) writes:
>
>I have an invitation for net readers - create a metaphor for computing 
>systems that goes beyond the desktop cliche.  Four years ago, Apple 
>had something with the Macintosh desktop: a new way to think about 
>computing.  Now, everyone is copying the desktop: Microsoft, IBM, 
>AT&T.  Even the new NeXT machine provides little more than a 
>desktop with some cute simulated depth.
>

I suggest all people who are involved with Information Technology be
required to read the following article before being allowed to
post netnews:

%A Vannevar Bush
%T As We May Think
%J Atlantic Monthly
%D July 1945
%X This article described an information
handling workstation of the future, at which a user could sit and browse
information which would appear on rear-projection screens;  links between
places in different documents would connect related information, and the
machine would be able to switch over to those related documents if they
were stored on the system (a concept today called "hypertext").
.br
This article also describes the concept of intertextual links {margin
notes} that became part of the documents and allow establishing
correlations and cross-references.  It also posited the concept of an
information space and a world-wide data space.

My point is that the above citation seems to be unknown to serious
researchers.  It describes a set of concepts that have not yet been
achieved.   A vague glimmering was attempted by Doug Engelbart at SRI
in his Augmented Knowledge Workshop.   I feel that people need to
re-examine a lot of the past, as I seem to see people keep re-inventing
old ideas that have been forgotten.

Of course, one should remember that had Albert Einstein not been
around, Vannevar Bush would have been the most famous scientist in the
USA.   It isn't surprising that his idea was so seminal.

-- 
=Dennis L. Mumaugh
 Lisle, IL       ...!{att,lll-crg}!cuuxb!dlm  OR cuuxb!dlm@arpa.att.com

dlm@cuuxb.ATT.COM (Dennis L. Mumaugh) (01/07/89)

In article <4362@pitt.UUCP> bonar@pitt.UUCP (Dr. Jeffrey Bonar) writes:
>
>I have an invitation for net readers - create a metaphor for computing 
>systems that goes beyond the desktop cliche.  Four years ago, Apple 
>had something with the Macintosh desktop: a new way to think about 
>computing.  Now, everyone is copying the desktop: Microsoft, IBM, 
>AT&T.  Even the new NeXT machine provides little more than a 
>desktop with some cute simulated depth.
>
>Marshall McLuhan said that a new medium always began by 
>imitating the old medium: cow paths were paved to make roads for 
>the "horseless carriage", film began by putting a camera in front of a 
>play, and finally, computer screens now look like a desktop.  What if 
>we really let go into our new medium; what should a computer work 
>space really look like?
>
One of these years I hope to meet up with someone who has read some
old-fashioned Science Fiction!!!

%A Arthur Clarke
%T Imperial Earth
%X Novel about a delegate to the Tri-centennial of US
Independence.   The plot surrounds his relationship with an old chum who
is brilliant and unstable.  A major plot element is the portable,
personal "computer" which is a lifelong companion, secretary,
notebook, filing cabinet and general reference library.  Said object, when
attached to the local equivalent of a "telephone" with ISDN and global
access, becomes one's entry to the world.

Check the book out.  That book, along with Vannevar Bush's article
(see previous post), describes a potential that makes cyberpunk look
sick. 
-- 
=Dennis L. Mumaugh
 Lisle, IL       ...!{att,lll-crg}!cuuxb!dlm  OR cuuxb!dlm@arpa.att.com

dykimber@phoenix.Princeton.EDU (Daniel Yaron Kimberg) (01/07/89)

In article <2350@cuuxb.ATT.COM> dlm@cuuxb.UUCP (Dennis L. Mumaugh) writes:
>[description of an article from 1945]
>My point is that the above citation seems to be unknown to serious
>researchers.  It describes a set of concepts that have not yet been
>achieved.   A vague glimmering was attempted by Doug Englebert at SRI
>in his Augmented Knowledge Workshop.   I feel that people need to
>re-examine a lot of the past as I seem to see people keep re-inventing
>old ideas that have become forgotten.

No one's re-inventing the wheel, but hand waving is hand waving.  The
real advances are going to be in actual systems.  You seem to be assuming
that the problem is a shortage of ideas, and complaining that we should be
spending more time seeing what's already out there.  Well, the real shortage
is in things like technology, funding, resources, and time.  Just because
current hand waving bears a striking resemblance to past hand waving, it
doesn't mean your re-invention alarm has to go off.  Most work from
1945 is probably so extrapolative as to make it worthless.  Who in 1945
could have predicted which vision of the future would seem right 40+ years
later?  The fact that one guy seems to have gotten it right is irrelevant.
We don't want to have to constantly search through all the chaff of
the past n years for the gems.
    On the other hand, similar papers published today (I have a few references
if anyone is interested), while proposing very similar ideas, are of a
better grade of hand waving, since their ideas are actually technologically
feasible.  (Of course, Technologically is only one species of Feasible.)  And
I certainly wouldn't expect people to grind through today's ideas forty years
from now to see what they can find.  If they still want cyberspace or office
metaphors in forty years, good for them, but it won't be because someone looked
up some forty year old articles, it'll be because the idea was good enough to
be continually re-invented until someone had the bright idea of doing something
about it.  If you're worried about the original idea-man not getting credit
for his good extrapolation, well that's life, and besides, there's nothing new
under the sun anyhow, right?

                                               -Dan

P.S. I haven't read the article in question; I am responding only to the
     ideas expressed in the message posted.

meissner@xyzzy.UUCP (Michael Meissner) (01/08/89)

In article <18640@agate.BERKELEY.EDU> c60a-2di@e260-1c.berkeley.edu (The
Cybermat Rider) writes:

| On the other hand, you're VERY LIKELY (in an office environment) to have
| DOZENS of computers sitting in one room.  You could insist that the computer
| companies concerned make their keyboards "tunable", but I doubt many people
| would like to fiddle around with recessed potientiometers on the bottoms of
| their keyboards, trying to adjust their transmission frequencies to avoid
| interefering with other keyboards in the vicinity.

I guess I don't understand this comment at all.  Think of it as an
ethernet -- each device on an ethernet has a unique 48-bit address
(addresses are sold to companies in chunks of 2**24, by a global
network number czar -- the IEEE, I think).  When a device (i.e., keyboard or
host) communicates, it puts its serial number in the packet sent
off, and if there is a collision, the packet is resent.  Above that,
there are various means to broadcast, and to map logical network IDs
to the physical hardware IDs.  As in ethernet, each device would
contain some PROM that gives this unique address.  Various techniques
have been worked out to do real networking via radio (see the Aloha
network references, and also Phil Karn's work with TCP/IP over ham
radio).
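
Just to pin down what "stamp the serial number on the packet and resend on
collision" means, here is a toy sketch in C.  It is not any real keyboard
protocol -- the device IDs, the 25% collision rate and the backoff cap are
invented purely for illustration:

/* Each device stamps its burned-in address on every packet and, on a
 * collision, waits a random number of slots before retrying -- the
 * basic ALOHA/Ethernet idea, boiled down to a toy.
 */
#include <stdio.h>
#include <stdlib.h>

struct packet {
    unsigned long source_id;  /* unique address from the device's PROM */
    int           keycode;    /* the keystroke being reported */
};

/* Pretend shared radio channel: returns 1 if the transmission collided. */
static int channel_send(const struct packet *p)
{
    (void)p;
    return rand() % 4 == 0;           /* assume a 25% collision rate */
}

/* Send one keystroke, backing off and retrying after collisions. */
static void send_key(unsigned long my_id, int key)
{
    struct packet p;
    int attempt = 0;

    p.source_id = my_id;
    p.keycode   = key;
    while (channel_send(&p)) {
        int slots;
        if (attempt < 6)
            attempt++;                /* cap the backoff window */
        slots = rand() % (1 << attempt);
        printf("device %lx: collision, backing off %d slots\n", my_id, slots);
        /* real hardware would idle for 'slots' slot-times here */
    }
    printf("device %lx: key '%c' delivered\n", my_id, key);
}

int main(void)
{
    send_key(0x123456UL, 'a');        /* two hypothetical keyboard IDs */
    send_key(0x654321UL, 'b');
    return 0;
}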

| I think there may be problems regarding FCC clearance as well, but I'm not
| an expert in that field, so I'll leave it to those in the know to enlighten
| us all further.  Suffice it to say that the problems encountered with many
| RF transmitters within a small space renders this idea somewhat impractical.

Part of the radio spectrum is reserved for RF devices in an
unregulated fashion and is used for cordless phones and such.
-- 
Michael Meissner, Data General.

Uucp:	...!mcnc!rti!xyzzy!meissner
Arpa:	meissner@dg-rtp.DG.COM   (or) meissner%dg-rtp.DG.COM@relay.cs.net

janssen@titan.sw.mcc.com (Bill Janssen) (01/08/89)

>%A Vannevar Bush
>%T As We May Think
>%J Atlantic Monthly
>%D July 1945

More easily found in:

Irene Greif (ed.), COMPUTER-SUPPORTED COOPERATIVE WORK:  A BOOK OF READINGS,
Morgan Kaufmann, San Mateo, CA, 1988.

along with all kinds of other good stuff.

Bill

jbn@glacier.STANFORD.EDU (John B. Nagle) (01/09/89)

In article <2643@xyzzy.UUCP> meissner@xyzzy.UUCP (Michael Meissner) writes:
>Part of the radio spectrum is reserved for RF devices in an
>unregulated fashion and is used for cordless phones and such.

      Cordless telephones require FCC type approval.  See
47 CFR 15.231 - 15.237.

      Low-power communications devices in general are regulated under
47 CFR 15, subpart D.  There is a band reserved for low-power non-voice
devices between 26.99MHz and 27.26MHz.  RF bandwidth is restricted to
20KHz, and power to 10,000uV/m at 3 meters.  Model R/C gear used to use
this band, but that activity has mostly been moved to the 72MHz and 75MHz area.
Some of the cheaper radio-controlled toys still use the 26MHz band.

      It appears, surprisingly, that type approval is NOT required in
this band.  So it is a candidate for within-room wireless data transmission.

      Other users use this band, so interference is to be expected.  However,
there are six channels in the band, and the regulations allow operation on
"one or more" of them.  So it would make sense to have a system which
changes frequency when interference is encountered.  Sort of like a
frequency-hopping ALOHANET.
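
A back-of-the-envelope sketch of that hop-on-interference idea, in C.  The
six frequencies and the "busy" test are placeholders rather than measured
data; the point is only the control flow: if the current channel is busy,
step to the next of the six and try again.

/* Toy frequency-hopper over six narrow channels: probe the current
 * channel, transmit if it is clear, otherwise hop to the next one.
 */
#include <stdio.h>

#define NCHANNELS 6

static const double channel_khz[NCHANNELS] = {
    26995.0, 27045.0, 27095.0, 27145.0, 27195.0, 27255.0   /* illustrative spacing */
};

/* Stub: a real device would measure energy on the channel. */
static int channel_is_busy(int ch)
{
    return ch < 2;                    /* pretend the first two are occupied */
}

int main(void)
{
    int ch = 0, tries;

    for (tries = 0; tries < NCHANNELS; tries++) {
        if (!channel_is_busy(ch)) {
            printf("transmitting on channel %d (%.0f kHz)\n", ch, channel_khz[ch]);
            return 0;
        }
        printf("interference on channel %d, hopping\n", ch);
        ch = (ch + 1) % NCHANNELS;
    }
    printf("all six channels busy, backing off\n");
    return 1;
}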

      This could all be made to work, but packing it into a keyboard
might require some semicustom ICs.

      Bear in mind that the channels in this band are rather narrow.
Wireless keyboards, yes.  Printers, maybe.  Graphic displays, no.  Disks, 
no way.

					John Nagle

kaufman@polya.Stanford.EDU (Marc T. Kaufman) (01/09/89)

In article <17988@glacier.STANFORD.EDU> jbn@glacier.UUCP (John B. Nagle) writes:

>      It appears, suprisingly, that type approval is NOT required in
>this band.  So it is a candidate for within-room wireless data transmission.
 ...
>      This could all be made to work, but packing it into a keyboard
>might require some semicustom ICs.

And I thought wireless baby monitors were bad...   This would be a wonderful
security hole in a system. (let's see... a brief burst of keyboard data from
a hopped-up 100 watt CB -> 'rm -rf *.*')

Marc Kaufman (kaufman@polya.stanford.edu)

maujt@warwick.ac.uk (Richard J Cox) (01/10/89)

In article <390@skep2.ATT.COM> wcs@skep2.ATT.COM (Bill.Stewart.[ho95c]) writes:
...
>Most of the Mac-haters I know are touch-typists who don't
>like moving their hands off the keyboard - the head-mouse
>may be able to let them get the best of both worlds.
...

I'm different: I'm a Mac hater, I'm not a touch typist, and I don't mind
moving my fingers off the keyboard.
The reason I hate the Mac is its user interface - it forces you to conform
too much, i.e. it is not customisable enough. When I use UNIX I use tcsh
(csh + command line editing via emacs-style CTRL chars) with >40 aliases,
a highly customised emacs, etc....

- RC

/*--------------------------------------------------------------------------*/
JANET:  maujt@uk.ac.warwick.cu     BITNET:  maujt%uk.ac.warwick.cu@UKACRL
ARPA:   maujt@cu.warwick.ac.uk	   UUCP:    maujt%cu.warwick.ac.uk@ukc.uucp
Richard Cox, 84 St. Georges Rd, Coventry, CV1 2DL; UK PHONE: (0203) 520995

uucibg@sw1e.UUCP (3929] Brian Gilstrap) (01/11/89)

In article <77@poppy.warwick.ac.uk> maujt@warwick.ac.uk (Richard J Cox) writes:
>
>I'm different: I'm a Mac hater, I'm not a touch typest, and I don't mind
>moving my fingers off the keyboard.
>The reason I hate the Mac is its user interface - it forces you to conform
>too much, ie it is not customisable enough. When I use UNIX I use tcsh
>(csh+command line editing via emacs style CTRL chars) with >40 aliases,
>highly customised emacs etc....
>

Hmmm....I certainly won't argue that the standard Finder-style interface
provides much customization.  I also won't dispute that customization can be
nice (being a unix programmer, I strongly agree).  However, the Mac has many
options for customization, including several general macro packages such as
QuicKeys, Tempo II, and Apple's MacroMaker.  Many of these packages let you
make your macros apply to all programs or just to a particular program.  Also,
many of the programs out now allow you to create macros within the scope of
the program.  You might want to take a second look at the Mac, if
customization is your real complaint.

Of course, the die-hard Mac-ites (I'm not one) might argue the appropriateness
of using such macros.  Personally, I think extensibility is the key to a long
software product lifetime, so in that sense I agree with you.  However, I own
a Mac II and I'm generally quite happy with the applications, so in that sense
I guess I have to disagree with you.

By the way, this seems to have digressed, so I've directed follow-ups to
comp.misc for lack of a better choice (the people in comp.sys.mac[.programmer]
have already "got the religion", so we'd get into ego-bashing if we moved
there :-)

Disclaimer: I'm not affiliated with Apple or any company that creates or markets
Macintosh software.  I am a generally satisfied customer, though I'll be more
satisfied when I've got Unix on my machine. :-)

>- RC
>
>/*--------------------------------------------------------------------------*/
>JANET:  maujt@uk.ac.warwick.cu     BITNET:  maujt%uk.ac.warwick.cu@UKACRL
>ARPA:   maujt@cu.warwick.ac.uk	   UUCP:    maujt%cu.warwick.ac.uk@ukc.uucp
>Richard Cox, 84 St. Georges Rd, Coventry, CV1 2DL; UK PHONE: (0203) 520995

Brian R. Gilstrap
One Bell Center Rm 17-G-4                  ...!ames!killer!texbell!sw1e!uucibg
St. Louis, MO 63101                        ...!bellcore!texbell!sw1e!uucibg
(314) 235-3929