[comp.society.futures] replacing the desktop metaphor

bzs@Encore.COM (Barry Shein) (12/25/88)

This past week I attended a talk entitled "Artificial Reality" by
Scott Fisher of NASA/AMES, hosted by the Boston Computer Society.

He is working on a system that uses a helmet with stereoscopic
displays and head-motion sensors, plus data gloves (gloves you put on
to interact with what you are viewing in the helmet).

As you turn your head, the scene changes to match. The gloves
support several types of input: grabbing things in the environment
and moving them about (one of the videotapes showed someone grabbing
menus and rearranging them in space), a simple sign language for
commands such as changing your perspective (a fist with the index
finger motioning upward started you moving up above the "room"), and
so on. The glove's on-screen image can also be replaced with a robot
arm or any other object, which then follows your motions.
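
To make the "sign language" idea concrete, here is my own guess, in
C, at how such gestures might be decoded from per-finger flex
readings. Every name and threshold below is invented by me, not
anything Fisher described:

    /* Speculative sketch of decoding glove "sign language": treat
     * each finger's flex sensor as 0.0 (straight) to 1.0 (curled)
     * and match whole-hand poses against a couple of commands.
     * All names and thresholds are invented. */
    #include <stdio.h>

    #define NFINGERS 5

    enum command { CMD_NONE, CMD_GRAB, CMD_FLY_UP };

    /* flex[0] is the thumb, flex[1] the index finger, and so on */
    enum command decode_gesture(const double flex[NFINGERS])
    {
        int i, curled = 0;

        for (i = 0; i < NFINGERS; i++)
            if (flex[i] > 0.8)
                curled++;

        if (curled == NFINGERS)
            return CMD_GRAB;               /* full fist: grab */
        if (curled == NFINGERS - 1 && flex[1] < 0.2)
            return CMD_FLY_UP;             /* fist + index up: rise */
        return CMD_NONE;
    }

    int main(void)
    {
        double fist[NFINGERS]  = { 0.9, 0.9, 0.95, 0.9, 0.85 };
        double point[NFINGERS] = { 0.9, 0.1, 0.95, 0.9, 0.85 };

        printf("fist  -> command %d\n", decode_gesture(fist));
        printf("point -> command %d\n", decode_gesture(point));
        return 0;
    }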

One of the goals is to help design new control environments for
NASA's space station: rather than being confronted with a visual
field of meters, dials, etc. on a typical control panel, operators
would be able to put on a helmet and set up a much more flexible,
consistent environment in, essentially, an empty room. Similar
applications were mentioned to help with EVAs (i.e., space walks).

Other applications by external groups included surgical assistance
(e.g., views to help match up previously designed surgical goals
during an actual operation; bone grafts were mentioned),
three-dimensional artwork (sort of a three-dimensional paint
program), interacting with simulations (there was one tape of a
shuttle-exhaust simulation you could step inside of, or even become
one of the particles), and of course recreational applications
(computer games; he mentioned the possibilities for adventure-style
games).

My thought was: wow, finally a solution for two people arguing about
the color schemes in the house; they can each have their own!

A version of the data glove is currently available from the same
company which has been working with the NASA/AMES group (I didn't
catch the name). The helmet is still under development, but Scott
Fisher assured the audience that there is a great deal of commercial
interest. He indicated there are still some details to be worked out
to make this viable (I believe he still has trouble generating the
graphics quickly enough to keep them in sync with all motions; the
result of falling behind is usually a bad case of motion sickness).
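
My guess at why falling behind hurts: the picture you see stops
matching where your head is. One crude way to keep up would be to
shed scene detail whenever a frame overruns its budget; the C below
is a made-up illustration of that idea, not NASA's code:

    /* Invented illustration of one way to stay in sync: if a frame
     * takes longer than the display budget, shed scene detail; if
     * there is lots of headroom, add detail back. */
    #include <stdio.h>

    #define FRAME_BUDGET_MS 33.0    /* roughly 30 frames per second */

    /* stand-in for the real renderer: cost grows with detail level */
    static double render_scene(int detail)
    {
        return 10.0 * detail;
    }

    int main(void)
    {
        int detail = 5, frame;

        for (frame = 0; frame < 6; frame++) {
            double ms = render_scene(detail);
            if (ms > FRAME_BUDGET_MS && detail > 1)
                detail--;           /* fell behind: simplify */
            else if (ms < FRAME_BUDGET_MS / 2 && detail < 10)
                detail++;           /* headroom: restore detail */
            printf("frame %d: %.0f ms, next detail %d\n",
                   frame, ms, detail);
        }
        return 0;
    }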

Graphics were all wire-frame for now.
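
For the curious, "wire-frame" just means projecting the 3-D
endpoints of each edge onto the screen and drawing lines between the
results. A toy C illustration, nothing to do with the actual
renderer:

    /* Toy wire-frame projection: push each 3-D endpoint through a
     * pinhole at the origin and draw a line between the 2-D
     * results. */
    #include <stdio.h>

    struct p3 { double x, y, z; };
    struct p2 { double x, y; };

    static struct p2 project(struct p3 p, double focal)
    {
        struct p2 q;
        q.x = focal * p.x / p.z;   /* assumes p.z > 0, ahead of eye */
        q.y = focal * p.y / p.z;
        return q;
    }

    int main(void)
    {
        /* one edge of a cube sitting 5 units ahead of the viewer */
        struct p3 a = { -1.0, -1.0, 5.0 }, b = { 1.0, -1.0, 5.0 };
        struct p2 a2 = project(a, 2.0), b2 = project(b, 2.0);

        printf("draw line (%.2f, %.2f) -> (%.2f, %.2f)\n",
               a2.x, a2.y, b2.x, b2.y);
        return 0;
    }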

The gentleman who introduced Scott (sorry, I forgot his name;
probably the president of the BCS or some such person) made a funny
remark about wondering if one day he'd wake up and realize that he'd
been wearing an artificial-reality helmet all his life...

	-Barry Shein, ||Encore||

cjosta@taux01.UUCP (Jonathan Sweedler) (12/25/88)

In article <4479@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>
>This past week I attended a talk entitled "Artificial Reality" by
>Scott Fisher of NASA/AMES, hosted by the Boston Computer Society.
>
>He is working on a system that uses a helmet with stereoscopic
>displays and head-motion sensors, plus data gloves (gloves you put on
>to interact with what you are viewing in the helmet).

The September 1988 issue of Byte has an article on this input/output
system being designed at NASA.  By the way, the glove is called the
DataGlove and is designed by VPL Research.  They also sell a product
called the DataSuit, an extension of the DataGlove consisting of
sensors in a suit that covers the user's whole body.  I won't go into
more detail; you can just read the Byte article.  It's called
"Between Man and Machine."

In the same magazine, there is also a whole section on display
technology.  The article "Face to Face" describes a 3D display
system being developed at TI.  The system is based on a rotating
disk, but the disk rotates at less than 10 Hz, so I don't think it
would have the low-pitched humming noise that I imagine the BBN
Spacegraph has.  The system is described as a "real-time,
auto-stereoscopic, multiplanar three-dimensional display system."  In
the prototype, the image can be viewed from any angle, and it has a
resolution of 500 by 500 pixels with a 4-inch depth of field.  Again,
see the article for more details.
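
As I understand multiplanar displays, each angular position of the
disk corresponds to one depth plane, whose pixels get flashed at just
that moment.  The C sketch below is my own guesswork; only the 4-inch
depth-of-field figure comes from the article:

    /* Guesswork sketch of slice scheduling for a rotating-disk,
     * multiplanar display: each disk angle corresponds to one depth
     * plane, flashed at just that moment.  Only the 4-inch depth of
     * field comes from the article; NPLANES is invented. */
    #include <stdio.h>

    #define NPLANES  50         /* invented: depth planes per rev */
    #define DEPTH_IN 4.0        /* quoted depth of field, inches */

    static int plane_for_angle(double angle_deg)
    {
        /* map disk angle 0..360 onto plane index 0..NPLANES-1 */
        return (int)(angle_deg / 360.0 * NPLANES) % NPLANES;
    }

    int main(void)
    {
        double angle;

        for (angle = 0.0; angle < 360.0; angle += 90.0) {
            int p = plane_for_angle(angle);
            printf("angle %5.1f deg -> plane %2d (depth %.2f in)\n",
                   angle, p, p * DEPTH_IN / (NPLANES - 1));
        }
        return 0;
    }
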
-- 
Jonathan Sweedler  ===  National Semiconductor Israel
UUCP:    ...!{amdahl,hplabs,decwrl}!nsc!taux01!cjosta
Domain:  cjosta@taux01.nsc.com

meo@stiatl.UUCP (Miles O'Neal) (12/27/88)

In article <4479@xenna.Encore.COM> bzs@Encore.COM (Barry Shein) writes:
>As you turn your head, the scene changes to match. The gloves
>support several types of input: grabbing things in the environment
>and moving them about (one of the videotapes showed someone grabbing
>menus and rearranging them in space), a simple sign language for
>commands such as changing your perspective (a fist with the index
>finger motioning upward started you moving up above the "room"), and
>so on. The glove's on-screen image can also be replaced with a robot
>arm or any other object, which then follows your motions.
>
>A version of the data glove is currently available from the same
>company which has been working with the NASA/AMES group (I didn't
>catch the name). The helmet is still under development, but Scott Fisher
...
>Graphics were all wire-frame for now.

"The Glove", as it is known around here, was at SIGGRAPH this year
in Atlanta. It was easily the neatest thing at the show, from a new
development standpoint. The hand's location in 3-d space is detected
either acoustically or magnetically (I forget which), but the neat
thing is the hand motion; fiber optic pipes sans outer shield run along
the wrist and fingers; the variable impedance of each pipe as it bends
with the hand determines the hand motion. An on-screen hand followed
The Glove perfectly. The wearer could grab the objects on-screen by
moving the hand appropriately, and manipulate them. These were simple
graphics, but were 3-d, shaded sloid object, not wire frame, and they
moved in real-time, tracking the hand motion wonderfully!
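
Here is my reconstruction of the fiber trick in C: an unshielded
fiber leaks more light the more it bends, so you calibrate each
finger straight and curled and interpolate readings in between. The
names and numbers are invented:

    /* My reconstruction of the fiber trick: an unshielded fiber
     * leaks more light the more it bends, so calibrate each finger
     * "flat" and "fist" and interpolate.  Numbers are invented. */
    #include <stdio.h>

    struct fiber_cal {
        double light_flat;    /* receiver reading, finger straight */
        double light_fist;    /* receiver reading, finger curled */
    };

    /* return flex in 0.0 (straight) .. 1.0 (fully curled) */
    static double flex_from_light(double light,
                                  const struct fiber_cal *c)
    {
        double f = (c->light_flat - light) /
                   (c->light_flat - c->light_fist);
        if (f < 0.0) f = 0.0;
        if (f > 1.0) f = 1.0;
        return f;
    }

    int main(void)
    {
        struct fiber_cal cal = { 0.95, 0.30 };  /* calibration pass */

        printf("half-curled reading: flex %.2f\n",
               flex_from_light(0.62, &cal));
        printf("straight reading:    flex %.2f\n",
               flex_from_light(0.95, &cal));
        return 0;
    }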

-Miles O'Neal
Sales Technologies, Inc.
gatech!stiatl!meo

dm@bbn.com (Dave Mankins) (12/28/88)

I went on a tour of the MIT Media Lab recently.  They have a
DataGlove attached to an HP-something-or-other.  After a bit of
training the HP to recognize your finger positions for grasping, you
could manipulate some Platonic solids placed on a grid.

I tried juggling two of the solids.  Since the HP was bogged down
rendering polygons, and since objects would pass through your hand
unless the HP noticed that you were in ``grasp position'', it was
like juggling on the moon, after several pints of beer, with hands
made of ectoplasm.
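
As far as I could tell, the system attaches an object to your hand
only when the hand is close enough AND the fingers match the grasp
pose you trained; miss either test for a frame and the object floats
free.  A C sketch of my reconstruction (not the Media Lab's code):

    /* My reconstruction, not the Media Lab's code: attach an object
     * to the hand only when the hand is near it AND the fingers
     * exceed the trained grasp threshold.  Numbers invented. */
    #include <math.h>
    #include <stdio.h>

    struct vec3 { double x, y, z; };

    static double dist(struct vec3 a, struct vec3 b)
    {
        double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return sqrt(dx * dx + dy * dy + dz * dz);
    }

    /* grasp_flex is the per-user trained threshold */
    static int grasping(struct vec3 hand, struct vec3 obj,
                        double mean_flex, double grasp_flex)
    {
        return dist(hand, obj) < 0.15 && mean_flex > grasp_flex;
    }

    int main(void)
    {
        struct vec3 hand = { 0.0, 1.0, 0.0 };
        struct vec3 cube = { 0.05, 1.02, 0.0 };

        printf("open hand grasps?   %d\n",
               grasping(hand, cube, 0.2, 0.7));
        printf("closed hand grasps? %d\n",
               grasping(hand, cube, 0.8, 0.7));
        return 0;
    }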

Cyberspace, here I come!
-- 
david mankins/dm@bbn.com