[comp.society.futures] Input devices for the computer of the future

PJS@GROUCH.JPL.NASA.GOV (Peter Scott) (01/13/89)

Okay, time to kick in my $0.02 worth.  Dreaming a little here... I'm not
going as far as human-quality continuous speech recognition because I
believe that that will happen only as a result of radical new technology
using a biological model, but everything here is just an extrapolation
(an *extreme* extrapolation!) of current capabilities.

The computer is portable - about the size of a Walkman.  This is mainly
so you have something big enough to get hold of and not lose, rather
than because it needs to be that big.  It has a slot for insertion of
memory modules (flopticals, maybe), about the size of a poker chip,
again, that *big* mostly so you can read the labels and don't lose them.

The computer's attached by two wires to a pair of glasses and a pair of
gloves.  The glasses are wrap-around (say, parabolic or spherical arc)
LCD-derivative technology that can display an opaque or semi-transparent
image, or be completely transparent when the computer has nothing to
display.  Naturally they are 2k * 2k * 24 bit minimum resolution or
equivalent (color, in case you hadn't guessed).  The gloves contain
pressure sensors and transducers.

Now pick your favorite metaphor; since there is a separate input for
each eye, you have stereo and therefore 3-D; a compass on the glasses
can record changes in head movement to make the metaphor complete.
If you want to type, the computer can supply an image of a display
screen and a keyboard which you `feel' with the gloves (could easily
be an ancient Underwood creating output on an illuminated parchment;
if it's made by the same team that did the Macintosh, you can bet there'd
be a choice :-)).  3-D CAD becomes a snap; use one hand to manipulate
objects in 3-D (already available but not coupled with stereoscopy yet)
and the other to change modes (flip scales, gravity on/off, all the sort
of stuff that happens in the menu bar).
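The stereo half of this is mostly arithmetic: render the scene twice, with the virtual camera shifted horizontally by half the distance between the eyes. A minimal sketch in Python (the pinhole model and the 65 mm eye separation are assumptions, not anything from real hardware):

```python
# Render each 3-D point twice: once per eye, with the eye shifted
# horizontally by half the interpupillary distance (IPD).

def project(point, eye_x, screen_dist=1.0):
    """Pinhole projection of (x, y, z) onto a screen plane at
    z = screen_dist, seen from an eye at (eye_x, 0, 0).  Assumes z > 0."""
    x, y, z = point
    scale = screen_dist / z
    return ((x - eye_x) * scale, y * scale)

def stereo_pair(point, ipd=0.065):
    """(left, right) screen coordinates for one point.
    0.065 m is a typical IPD -- an assumed figure."""
    half = ipd / 2.0
    return project(point, -half), project(point, +half)

# A point two metres straight ahead lands at slightly different
# horizontal positions in the two images; that disparity is depth.
left, right = stereo_pair((0.0, 0.0, 2.0))
```

Head tracking from the compass would rotate each point into head coordinates before projection; that step is omitted here.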

How about digitizing?  With a little creative technology, when the
glasses are in transparent mode they can be engineered to record the
HLS value of the light they're transmitting, so all the user needs to
do to digitize an object is to look at it, draw a rough outline with
his/her finger, call an edge-detection routine, and do fine cleanup with a
pixel editor.
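The edge-detection step, at least, is ordinary present-day image processing; a Sobel gradient filter is the textbook choice. A toy version in plain Python, run on a made-up grayscale grid rather than anything the glasses would capture:

```python
def sobel_magnitude(img):
    """Return the Sobel gradient magnitude of a 2-D grayscale image
    (a list of lists of numbers), leaving the one-pixel border at zero."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical 3x3 derivative kernels.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge between dark (0) and bright (9) columns:
img = [[0, 0, 9, 9]] * 4
mag = sobel_magnitude(img)
# The magnitude is large in the interior columns next to the step
# and zero on the untouched border.
```

A real digitizer would then threshold this magnitude and trace the resulting contour; the rough finger-drawn outline just tells it where to look.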

Oops, guess I forgot the earplugs.  Stereo sound, so that if you `pick up'
an object, and drop it, it can make a noise.  A useful form of positive
feedback.  Can also be used to alert you to the presence of objects out
of view.
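The stereo cue itself can be approximated with simple amplitude panning: weight the left and right earplug gains by where the object sits. A sketch of the standard constant-power pan law (the angle convention here is an assumption):

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power panning: map an azimuth in [-90, +90] degrees
    (negative = left of the listener) to (left, right) channel gains
    whose squares always sum to 1, so loudness stays constant."""
    # Map [-90, +90] degrees onto [0, pi/2] radians.
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

# An object dropped straight ahead sounds equal in both ears;
# one dropped off to the right is louder in the right channel.
center = pan_gains(0.0)
right = pan_gains(60.0)
```

Real localization also depends on inter-ear time delay and filtering by the head, but plain panning is enough for a "you dropped something over there" cue.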

Now imagine what kinds of games you can play with this sucker.

(Whole thing runs for 6 months off a `D' cell and costs under $1k :-))

Actually, I've gotten so excited about this thing while writing this I
can hardly wait to get mine.  Now, we put in a cellular antenna for
networking...

Any criticism as to the infeasibility of this machine will be met with 
the standard retort that all things are possible with nanotechnology,
which seems to be a standard assumption until we discover otherwise...

Peter Scott (pjs%grouch@jpl-mil.jpl.nasa.gov)

snoopy@sopwith.UUCP (Snoopy) (01/15/89)

In article <890112093223.000003A6171@grouch.JPL.NASA.GOV> PJS@GROUCH.JPL.NASA.GOV (Peter Scott) writes:

|The computer's attached by two wires to a pair of glasses and a pair of
|gloves.  The glasses are wrap-around (say, parabolic or spherical arc)
|LCD-derivative technology that can display an opaque or semi-transparent
|image, or be completely transparent when the computer has nothing to
|display.  Naturally they are 2k * 2k * 24 bit minimum resolution or
|equivalent (color, in case you hadn't guessed).  The gloves contain
|pressure sensors and transducers.

Everyone seems to think that active eyewear will be all the rage.  Perhaps
someone could explain how you keep from causing mega-eyestrain?  Reading
conventional books or by-now-more-or-less-conventional CRTs causes way too
much eyestrain as it is.
    _____     
   /_____\    Snoopy
  /_______\   
    |___|     tektronix!tekecs!sopwith!snoopy
    |___|     sun!nosun!illian!sopwith!snoopy

garye@hpdsla.HP.COM (Gary Ericson) (01/17/89)

Having a display in a pair of glasses (assuming you don't already wear
glasses) would certainly solve the physical space problem of having a large
display.  One definite drawback, though, is that two people can't look at
the same display simultaneously.  Maybe the solution there would be to
network his computer to yours, putting a window of your display on his
glasses along with a pointer you control so you can point things out on the
display.

Considering a company already makes something like this (Reflection
Technologies' "Private Eye": mono, black & white), this is really not
far-fetched.  I think it would be important, as was mentioned, to do some 
studies on eye strain before anyone gets too carried away with this idea.

Gary Ericson - Hewlett-Packard, Workstation Technology Division
               phone: (408)746-5098  mailstop: 101N  email: gary@hpdsla9.hp.com

caasi@sdsu.UUCP (Richard Caasi) (01/20/89)

Well, while we're on the subject of man-machine interfaces, why not just
wire some electrodes directly into the brain and dispense with the infinite
variations of input devices?  This would solve the eyestrain problem, the
machine code to ASCII/EBCDIC mapping, the typing problem, eliminate voice
input issues, etc., etc.  After all, the brain processes more information
than the average speed reader could absorb.  Just think of it, the
ultimate form of communication - the direct transfer of thought with no
ambiguities (or typos); in short, complete understanding; no need for the
discrete symbol-processing system we call human language to constrain the
analog world of thinking.  Maybe we could use neural network front ends to
powerful computers to hook up to.

Richard Caasi
San Diego State University

hjespers@attcan.UUCP (Hans Jespersen) (01/20/89)

Prosthetic devices are getting more and more advanced every day. Consider
this, a "virtual keyboard" designed for use by people without functional
hands or fingers. Electric pulses which propagate along the nerves to the
hands and fingers (which perhaps don't exist) are picked up by electronic 
sensors and a small microprocessor translates these hypothetical motions
into an alphanumeric character or perhaps a machine instruction. Incredible
input speeds could be achieved. The hardest part might be training the user
to "type" the right key. Different interfaces could be designed for 
completing different tasks. People could be trained to single handedly (no
pun intended) control complicated equipment that normally would require
several operators.  Just imagine a quadriplegic space shuttle
pilot who can control all aspects of the complicated equipment that (s)he
is "wired" into, a job that currently takes several men and numerous
computers to perform.  Ahh, dreams.
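Mapping the sensed pulse patterns onto keys is, at bottom, a pattern-classification problem. A deliberately tiny sketch: nearest-template matching against per-key calibration readings (the three-sensor feature vectors and the key set are entirely invented for illustration):

```python
def classify(sample, templates):
    """Return the key whose stored template (a vector of nerve-sensor
    readings) is nearest to the incoming sample, by squared
    Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda k: dist(sample, templates[k]))

# Hypothetical calibration data: each key's typical sensor reading,
# recorded while the user "types" that key during training.
templates = {
    'a': (0.9, 0.1, 0.0),
    's': (0.1, 0.8, 0.1),
    'd': (0.0, 0.2, 0.9),
}
key = classify((0.2, 0.7, 0.2), templates)   # nearest to 's'
```

The training problem mentioned above is exactly the calibration step: collecting enough labeled samples per key that the templates separate cleanly.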


---------------------------------------------------------------------------
Hans Jespersen                UUCP: uunet!attcan!hjespers
AT&T Canada Inc.                or     ..!attcan!nebulus!arakis!hans
Toronto, Ontario              #include <std.disclaimer>

U1DF1@WVNVM.WVNET.EDU ("John Neubert") (01/21/89)

>Well, while we're on the subject of man-machine interfaces, why not just
>wire some electrodes directly into the brain and dispense with the infinite
>variations of input devices?  This would solve the eyestrain problem, the
>machine code to ASCII/EBCDIC mapping, the typing problem, eliminate voice
>input issues, etc., etc.  After all, the brain processes more information
>than the average speed reader could absorb.  Just think of it, the
>ultimate form of communication - the direct transfer of thought with no
>ambiguities (or typos); in short, complete understanding; no need for the
>discrete symbol-processing system we call human language to constrain the
>analog world of thinking.  Maybe we could use neural network front ends to
>powerful computers to hook up to.

>Richard Caasi
>San Diego State University

This is not a crazy idea... and I didn't see any smilies.  I was a film
major way back when, pre-CS days.  An alternative cinema book I was
studying had a final chapter in which the author postulated the ultimate
creative cinema recorder -- brain hook up so one could record one's
thoughts or dreams.  The idea pervades much sf -- how about Spock's
brain connection?

The problems are legion (e.g., our thoughts are often random and
confused, as one jumps between seemingly unrelated thoughts --
or backtracks).  Language, written and spoken, seems to
slow our thinking process down enough to make sense between people.
Then again, perhaps I'm wrong on that.  Remember the computers in
The Forbin Project?  Once they agreed on the language, they "talked/
thought" so fast between them, humans couldn't keep up.  Perhaps
two intellects could converse so fast at "thought speed" that they
could do much more than normally -- or they would get so frustrated
trying to follow the other's thought processes they'd be at each
other's throats.

Interesting concept anyway.  Micro-sensors, micro-processors, and
nanotechnology may even make it possible.

This has all been off the top of my head at "thought speed", so
take it at that.

elm@ernie.Berkeley.EDU (ethan miller) (01/21/89)

In article <8901201826.AA08787@multimax.encore.com> U1DF1@WVNVM.WVNET.EDU ("John Neubert") writes:
#[stuff about direct computer-brain interface deleted]
#This is not a crazy idea... and I didn't see any smilies.  I was a film
#major way back when, pre-CS days.  An alternative cinema book I was
#studying had a final chapter in which the author postulated the ultimate
#creative cinema recorder -- brain hook up so one could record one's
#thoughts or dreams.  The idea pervades much sf -- how about Spock's
#brain connection?

Two sf books I will recommend that deal with this subject are _Neuromancer_,
by William Gibson (there are sequels, but I haven't read them) and _The
Genesis Machine_, by James P. Hogan.  The first deals with man-machine
interface to a worldwide computer network, and much of the information
is delivered by "converting" it to sight and sound, etc.  Of course, that's
just the way it was written down; perhaps the protagonist's brain was
working in some way which we can't understand.  Maybe it's like trying
to explain "green" to someone who has been blind from birth.

The other book uses a more conventional approach--the computer is hooked
up to the brain directly, and visualized problems are picked up by the
computer and solved (incidentally, this is not the main focus of the
book, but it is mentioned a few times).  For example, to run a
finite-element analysis of a bridge, you would visualize the bridge
and provide relevant information like materials and dimensions.  One
problem brought up was that, given today's computers, there would be
problems "imagining" correctly.  The computer still followed orders
literally and provided GIGO.  Perhaps we should do more work in AI
and "do what I mean, not what I say" computers before hooking them
directly to the brain (of course, there's the minor problem of
_designing_ a brain-computer interface, but that exercise is left
to the reader :-).

ethan
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*+*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
ethan miller (UCB CS grad student)         | bandersnatch@ernie.berkeley.edu   
"Quod erat demonstrandum, baby." -T. Dolby | {...}!ucbvax!ernie!bandersnatch

garye@hpdsla.HP.COM (Gary Ericson) (02/16/89)

[Apologies to the net.  I tried responding to Gordon by email, but attempts
 failed.]

In an article on "Input devices for the computer of the future" I mentioned the
"Private Eye" device which uses a small eyepiece to display a full screen to
the user.  Gordon Booman asked me for the address, and my reply follows:

> In article <400010@hpdsla.HP.COM> you write:
> >...
> >Considering a company  already makes something like this (Reflection 
> >Technologies' "Private Eye": mono, black&white)...

> Do you have an address or any info on how I can contact them?
> Thanks,
> -- 
> Gordon Booman  SSP/V3   Philips TDS Apeldoorn, The Netherlands   +31 55 432785
> domain: gordon@idca.tds.philips.nl             uucp:  ...!mcvax!philapd!gordon

I have an article from a magazine (I forget which) that doesn't list the
address, but it does say they're in Waltham, Mass., so I called directory
assistance out there and got the following address (I figured I could call
Mass. directory assistance easier than you could from The Netherlands 8^):

	Reflection Technologies, Inc.
	240 Bear Hill Road
	Waltham, MA  02254

	phone: (617) 890-5905

I haven't tried this phone number or address, but I assume it's correct.

Gary Ericson - Hewlett-Packard, Workstation Technology Division
               phone: (408)746-5098  mailstop: 101N  email: gary@hpdsla9.hp.com