[comp.graphics] Virtual reality

simstim@milton.acs.washington.edu (simstim) (10/27/89)

I am attempting to locate information on the subject of Virtual Reality.
If you know of any publications, articles, ftp archives, etc., please
email me directly. I will summarize the responses.

Thank you for your assistance.

						- Steve

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"There are two major products that come out of Berkeley: LSD and UNIX. We
 don't believe this to be a coincidence." ||   - Jeremy S. Anderson 

#include <disclaimer.h>                   simstim@milton.u.washington.edu
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

mjb@nucleus.UUCP (Mark Bobak) (11/08/89)

In article <3600003@hpindda.HP.COM> eli@hpindda.HP.COM (Eli Lauris) writes:
>/ hpindda:comp.graphics / simstim@milton.acs.washington.edu (simstim) / 10:14 pm  Oct 26, 1989 /
>
>>I am attempting to locate information on the subject of Virtual Reality.
>                                                         ^^^^^^^^^^^^^^^
>Seems to me a contradiction in terms. Anyway, what does it have to do with
>graphics ???


Well, this is some info I put together on a local BBS and was planning to
post here soon anyhow.  Here goes:

On Thursday, November 2, 1989, I attended Autofact '89 at Cobo Hall.  Autofact
is a CIM (computer-integrated manufacturing) new-product show.  Among the
usual run-of-the-mill stuff I saw, the Autodesk Virtual Reality Project really
got my attention.  This is a system whereby you can interact with a 3D model
in 3D; you are part of the model.  Let me explain.

You start out with a 3D wireframe of your model, which is then solid modeled.
The solid model is imported into "cyberspace."  Now the fun starts.  You put
on a head-mounted display (it resembles a scuba diving mask) which contains
two LCD TV screens, one for each eye, which project a stereo image of your
model.  You can actually see the model in 3D, like a 3D movie.
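
To give a rough idea of what the display side involves, here is a tiny C
sketch of how a stereo pair can be produced: render the scene twice, from
eye positions offset to either side of the head.  This is just my own
illustration; the eye-separation constant and the function names are made
up, not Autodesk's code.

/* Stereo pair sketch: one render per LCD screen. */
#include <stdio.h>

typedef struct { double x, y, z; } Vec3;

#define EYE_SEPARATION 0.065   /* roughly 65 mm between the eyes, in meters */

/* Stand-in for the real renderer. */
void render_view(Vec3 eye, const char *which)
{
    printf("rendering %s view from (%.3f, %.3f, %.3f)\n",
           which, eye.x, eye.y, eye.z);
}

/* Offset the head position half the eye separation along the head's
 * "right" axis and render once for each eye. */
void render_stereo_frame(Vec3 head, Vec3 right_axis)
{
    double h = EYE_SEPARATION / 2.0;
    Vec3 left = head, right = head;

    left.x  -= right_axis.x * h;
    left.y  -= right_axis.y * h;
    left.z  -= right_axis.z * h;
    right.x += right_axis.x * h;
    right.y += right_axis.y * h;
    right.z += right_axis.z * h;

    render_view(left, "left eye");
    render_view(right, "right eye");
}

int main(void)
{
    Vec3 head = { 0.0, 1.7, 0.0 };        /* standing viewer */
    Vec3 right_axis = { 1.0, 0.0, 0.0 };  /* head facing down -z */
    render_stereo_frame(head, right_axis);
    return 0;
}
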
Also, you put on the "DataGlove," which can detect the position of each
finger of your hand and has a sensor to determine the relative location of
your hand in space.  The computer also projects a "model" of your hand on the
display, which dynamically changes as you move your hand.  Using different
hand motions, you can interact with the model.  There is also a sensor on top
of the head-mounted display which determines the position of your head.

You can "fly" through space by pointing your finger in the direction you want
to go, and you can "grab" objects by intersecting them and making a fist.  As
you "travel" through space, the display dynamically changes as you look
around.  Once you've grabbed an object, you can examine it in detail by just
turning it in your hand, as if you had picked a book off the shelf.  To erase
something, simply grab it and throw it away; the object will tumble through
space and disappear, never to be seen again.  A few basic editing functions
are also available: you can scale an object up or down in size, copy an
object, and move an object.  When you complete an operation, the cyberspace
system automatically feeds the modifications back to the AutoCAD wireframe.
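
To illustrate the gesture side, here is a small C sketch of the kind of
posture test a glove-driven system might use to recognize a "fist" (grab)
or a "point" (fly) from per-finger bend angles.  The threshold and the data
layout are my guesses, not VPL's actual interface.

/* Glove posture sketch: classify "fist" and "point" from bend angles. */
#include <stdio.h>

#define NFINGERS 5
#define BENT     60.0   /* degrees of flex treated as "curled" */

typedef struct {
    double bend[NFINGERS];   /* 0 = thumb .. 4 = pinky, in degrees */
} GloveState;

/* A fist: every finger curled past the threshold. */
int is_fist(const GloveState *g)
{
    int i;
    for (i = 0; i < NFINGERS; i++)
        if (g->bend[i] < BENT)
            return 0;
    return 1;
}

/* A point: index finger straight, the remaining fingers curled. */
int is_point(const GloveState *g)
{
    int i;
    if (g->bend[1] >= BENT)
        return 0;
    for (i = 2; i < NFINGERS; i++)
        if (g->bend[i] < BENT)
            return 0;
    return 1;
}

int main(void)
{
    GloveState g = { { 80.0, 10.0, 75.0, 82.0, 90.0 } };  /* index out */
    printf("fist: %d  point: %d\n", is_fist(&g), is_point(&g));
    return 0;
}
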
A couple of examples from the demo (which used a mock-up model of a car
radio): copy one of the radio knobs, scale it to 26x its original size, and
you can actually "enter" the knob and look at its internal structure; or
enter the radio through the dial and examine the location of the chips
inside.

I had an opportunity to try this out myself.  It was *incredible*!  Everyone
ought to get a chance at playing in cyberspace!  The demo they ran me through
was an architectural model of a house.  I started in the den, "flew" up to
the bookshelf to check out the selection, went out to the pool to get the
view of the world from the bottom of the pool (everything had a blue tint),
etc., etc.  It was unbelievable!  I heartily recommend it to anyone who gets
a chance to try it out!!  I've included a copy of the 1-page info sheet that
I got on this thing.  It follows here:




                    The Autodesk Virtual Reality Project

                        The Autodesk Cyberspace Team
                              October 24, 1989

The Autodesk "virtual reality" or "cyberspace" project is pioneering a new
way of interacting with computers.  In cyberspace, users interact with models
and
data as though they were real, creating a more natural and intuitive
environment for computer applications.  It is the next big step in changing
the face of computing as the user sees it.

In the 1960's, with the advent of the text-based video display terminal (VDT),
all computer applications shared a similar "user interface" that was often
characterized by confusing screens full of text.  In the 1970's, Xerox PARC
took advantage of the emerging technology of graphics-based computer displays
and created what is today the most widely used (or sought-after) user
interface: the so-called desktop metaphor.  Today, Autodesk is working
toward the next giant leap - toward a 3-dimensional, highly interactive user
interface that will forever change the way we work with and think about
computers.

Using a head-mounted display, special position sensors, and high-speed
graphics accelerators combined with software developed at Autodesk, a
cyberspace user is immersed in a computer-generated 3D world directly under
his or her own control.  The user (or "patron") can fly through space in any
direction and orientation, while simultaneously being able to turn his or
her head in any direction and have the view properly presented in the head-
mounted display.  Using the DataGlove (from VPL Research, Inc.), the cyberspace
patron can give commands to the system using gestures and by "grabbing" items
on a heads-up-display-style menu system.  Objects in the model can be
selected for editing by pointing at the object using a virtual laser beam
pointer.  Once selected, objects can be moved, stretched, rotated, or have
other geometric transformations performed on them dynamically.  Once changed,
the object's data can be automatically transferred (via a network connection)
to AutoCAD, updating the model in its database.  This facility is being
expanded to include access to the wide variety of AutoCAD commands for
constructing and editing 3D computer models.
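
A sketch of the "virtual laser beam" selection idea, for the curious: cast
a ray from the hand along the pointing direction and test it against each
object's bounding volume.  The C below is an illustration with invented
names, not Autodesk's implementation; a real system would test the actual
geometry rather than a bounding sphere.

/* Laser-pointer picking sketch: ray vs. bounding sphere. */
#include <stdio.h>

typedef struct { double x, y, z; } Vec3;
typedef struct { Vec3 center; double radius; int id; } Sphere;

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Does the ray (origin, unit direction) hit the sphere? */
int ray_hits(Vec3 origin, Vec3 dir, const Sphere *s)
{
    Vec3 oc = { s->center.x - origin.x,
                s->center.y - origin.y,
                s->center.z - origin.z };
    double t  = dot(oc, dir);           /* closest approach along the ray */
    double d2 = dot(oc, oc) - t * t;    /* squared miss distance there    */
    return t > 0.0 && d2 <= s->radius * s->radius;
}

int main(void)
{
    Sphere knob = { { 0.0, 0.0, -5.0 }, 0.5, 42 };  /* a radio knob */
    Vec3 hand = { 0.0, 0.0, 0.0 };
    Vec3 aim  = { 0.0, 0.0, -1.0 };     /* pointing straight ahead */

    if (ray_hits(hand, aim, &knob))
        printf("selected object %d\n", knob.id);
    return 0;
}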

At Autodesk, we are exploring the use of this new technology for several
applications in the short term.  Cyberspace interaction will likely prove to
be the most useful way of working with 3D computer models of real-world (or
imagined) systems, and will greatly benefit CAD users in applications such as
architecture, plant design, layout, mechanical assembly, and animation.  As
faster and lower-cost hardware (head-mounted displays, sensors, graphics
display engines, etc.) becomes available, we will be in a perfect position to
exploit their new capabilities.  In addition, the user interaction work done
for the cyberspace project is equally applicable to systems that use more
conventional methods of display and input, like high-speed graphics
workstations using mice and digitizers.  Although head-mounted displays offer
the unmistakable advantage of actually "being there", other systems (stereo
displays using LCD glasses, etc.) have certain benefits as well, including
cost effectiveness and greater short-term acceptability.  The system of
interaction developed by the Autodesk cyberspace team is hardware independent
and easily adapted to these alternative technologies.

In the long term, virtual reality technology may dramatically change the way
we work with computers and even the way we work.  Among the many developments
being explored by the cyberspace team at Autodesk is the use of the system by
multiple patrons as a communications medium.  Given enough bandwidth between
remote patrons, it is conceivable that the technology will enable multiple
users to interact as though they were physically in the same room.  Further
out, it is quite likely that the technology will supplant commuting.
Imagine going to work in the Bahamas every day through your cyberspace
optical fiber link at home.  We believe that this is not only possible but
probable in the future.

How distant is that future?  That is a hard question to answer.  One aim of
the Autodesk virtual reality project is to stimulate the commercial
development of the hardware needed for that future by creating applications
in the short term.  Autodesk plans to have a product that uses this
technology available within the next year.  It is our intent that this early
development work will both spawn an industry and establish a new standard for
ease of use in working with 3D computer-generated models.






Copyright 1989 Autodesk, Inc.
Autodesk, Inc.
2320 Marinship Way
Sausalito, California 94965


Copied without permission

-- 
Mark Bobak
The Nucleus Public Access Unix, Clarkston, Mi.
mjb@nucleus.mi.org
mjb@m-net.ann-arbor.mi.us

rfh@unssun.unscs.unr.edu (J.A. MacDonald) (11/11/89)

I missed the posting, but simstim@milton.acs.washington.edu (simstim) wrote

> I am attempting to locate information on the subject of Virtual Reality.


   Funny you should ask this now.  Just this morning I read an article in PC
Computing, the latest issue I think, regarding Virtual Reality (or Artificial
Reality).  That's one place to locate information.  mjb@nucleus.UUCP (Mark
Bobak) also posted a description of it.  If I may be so bold, I'd like to add
some information of my own to that.

   Artificial reality is a field being developed by NASA as well as several
firms in the Bay Area (AutoDesk and VPL Research).  Originally the idea was
developed by a fellow named Krueger (? I'm going from memory).  It involves
entering a 3D space defined in a computer's memory by means of two interface
devices: the DataGlove (built by VPL) and the EyePhone (also by VPL).  NASA
uses its own version of the EyePhone.  The DataGlove uses two fibreoptic
lines running down the back of each finger and the thumb to measure the
angle of bend in two joints (the joint at the hand and the middle joint;
sorry, my anatomical terms are pathetic :-).  On the back of the glove is
a device (name escapes me, starts with a p) that consists of three wire
coils positioned orthogonally to each other.  Somewhere else in the room is
a similar (but larger) setup.  This setup allows for the tracking of the
attitude of the glove in space.
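
   For illustration, here is a little C sketch of how a raw bend-sensor
reading might be turned into a joint angle: light loss in the fibre grows
with flex, so calibrate against a straight pose and a known full bend, then
interpolate.  The linear model and the numbers are my assumption; I don't
know VPL's actual method.

/* Bend-sensor calibration sketch: raw reading -> joint angle. */
#include <stdio.h>

typedef struct {
    double raw_straight;   /* reading with the finger straight        */
    double raw_bent;       /* reading at a known full bend            */
    double full_angle;     /* the angle of that full bend, in degrees */
} BendCal;

double bend_angle(const BendCal *cal, double raw)
{
    double span = cal->raw_bent - cal->raw_straight;
    if (span == 0.0)
        return 0.0;   /* degenerate calibration */
    return (raw - cal->raw_straight) / span * cal->full_angle;
}

int main(void)
{
    BendCal cal = { 1000.0, 400.0, 90.0 };   /* made-up calibration */
    printf("reading 700 -> %.1f degrees\n", bend_angle(&cal, 700.0));
    return 0;
}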

   The EyePhone is simply a device that looks like a scuba mask (or at
least it does in the pictures in the article) with two little colour 
CRT's in it, one for each eye. This provides for stereo viewing.

   Once equipped, you enter the world of the computer's memory.  VPL set up
a demo of a day care centre. Two people can enter this demo at one time.
In it you move about the room by pointing or walking the fingers in a 
direction and there you go. You can pick up objects, move them around, etc.
The AutoDesk setup is used with AutoCAD-generated information for designing
stuff. See Bobak's previous posting. At NASA they plan to use it to tour
places via information obtained using probes.

   This technology sounds to me like the next form of man/machine interface.
Designing a home comes down to picking up walls and putting them in place.
Let's try a window here.  No?  Pick it up and either toss it or move it
somewhere else.  Picture hiring a builder/designer to build you a home.
Several days later you "tour" your new home with him, suggesting changes
which he implements right before your eyes!  Eventually you have your dream
house and he knows exactly what you want.  Zap a point on the menu: up pops
an approximate cost, schedule, etc.  Zap another menu and out go the plans
to the plotter.  Remove the DataGlove and the EyePhone and, voila, there are
the plans waiting to be built.

   Or how about this: Let's build a space station. The whole thing can first
be done in a computer's memory. I.e. a training simulation. Then launch the
materials and a bunch of robots. Each robot has two cameras (eyes) and two
fairly dextrous hands (or maybe more), plus thrusters to move about. Down
on earth are a bunch of "construction workers" wearing DataGloves and
EyePhones manipulating them.  "What do I do next?"  Pop up the simulation
data: wow, hey, deja vu.  Ever wish you had more hands to do a job ("Honey,
could you hold this for a minute?")?  Use voice commands to change control
of the glove to another arm.

   
WHAT I WANT TO KNOW IS HOW DO I GET TO TRY THIS STUFF OUT?
                       ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


J.A. MacDonald


disclaimer unnecessary -- unemployed grad student soon to be AVAILABLE
==========================================================================
rfh@unssun.unscs.unr.edu     |  AAAARRRGGHH!!!!!!!  - it's only a dream...
University of Nevada - Reno  |  WHERE'S MY AXE??!!  - it's only a dream...
==========================================================================

sjs@spectral.ctt.bellcore.com (Stan Switzer) (11/17/89)

In article <2058@bacchus.dec.com> klee@decwrl.dec.com writes:
> The idea is to extend the "direct manipulation" style of user interface
> to use simulated physical objects and simulated operations on those
> objects.   Using a data glove to touch/move/throw a ball drawn on your
> screen is one example.  Another is using a treadmill device to walk
> around a virtual room.  You can probably think of others.

About 8 years ago at the University of Kansas, the Biology Dept had an
apparatus to study the influence of pheromones on our favorite urban
insect pest.  The device was a large sphere mounted on bearings and
turned by perpendicularly mounted drive wheels.  The subject was
placed on top of the device and monitored with photodetectors so that
as it moved off-center, the sphere was moved to bring him back to the
top.  The "progress" of the nasty beast was collected in a file for
statistical analysis to see whether he was moving toward the bogus
seductress or was oblivious to the ersatz odor.
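
Schematically the re-centering is just proportional feedback; here is a C
sketch, with the gain, the I/O, and the log format all invented from memory
of the idea rather than taken from the actual apparatus:

/* Trackball servo sketch: drive the sphere against the subject's drift. */
#include <stdio.h>

#define GAIN 0.8   /* proportional gain; an invented value */

/* One control step; (dx, dy) is the off-center offset reported by the
 * photodetectors. */
void servo_step(double dx, double dy,
                double *path_x, double *path_y, FILE *log)
{
    double wheel_x = -GAIN * dx;   /* turn the sphere against the drift */
    double wheel_y = -GAIN * dy;

    /* Whatever the sphere turns to re-center the subject is the distance
     * the subject "walked", so accumulate it for later analysis. */
    *path_x += dx;
    *path_y += dy;
    fprintf(log, "%f %f\n", *path_x, *path_y);

    /* drive_wheels(wheel_x, wheel_y);  -- hardware call, stubbed out */
    (void)wheel_x;
    (void)wheel_y;
}

int main(void)
{
    double px = 0.0, py = 0.0;
    servo_step(0.02, -0.01, &px, &py, stdout);   /* one sample step */
    return 0;
}
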

It'd be rather difficult to reproduce at the human scale, but it is
interesting to note that cockroaches may be the first beings to
"enjoy" the fruits of Artificial Reality.

Reach out and (artificially) touch (a surrogate representation of)
somebody...

Stan Switzer  sjs@bellcore.com

macs%worke@Sun.COM (Manuel A. Cisneros) (11/17/89)

In article <378@opus.NMSU.EDU> rfh@unssun.unscs.unr.edu (J.A. MacDonald) writes:
>
>   Or how about this: Let's build a space station. The whole thing can first
>be done in a computer's memory. I.e. a training simulation. Then launch the
>materials and a bunch of robots. Each robot has two cameras (eyes) and two
>fairly dextrous hands (or maybe more), plus thrusters to move about. Down
>on earth are a bunch of "construction workers" wearing DataGloves and
>EyePhones manipulating them.  "What do I do next?"  Pop up the simulation
>data: wow, hey, deja vu.  Ever wish you had more hands to do a job ("Honey,
>could you hold this for a minute?")?  Use voice commands to change control
>of the glove to another arm.
>

This is really an extension of the concept of Waldos; it's just using
a digital medium for feedback rather than a mechanical one.  Neat stuff!

Manuel.

Chris_F_Chiesa@cup.portal.com (11/20/89)

I've been following with interest the discussion of Virtual Reality -- it
sounds like something I've been waiting for ever since I started reading
science fiction as a child!  In a message whose header Portal neglected to
include, rfh@unssun.unscs.unr.edu (J.A. MacDonald) waxed enthusiastic about 
the potential of V.R. technology, then asked:
 
>    
> WHAT I WANT TO KNOW IS HOW DO I GET TO TRY THIS STUFF OUT?
>                        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I want to add a heartfelt "Yeah!  Me TOO!" -- AND the additional request for
names of "contact persons" at places working on this, to whom resumes might
be sent by potential employees...!  (E-mail please; there are certain
difficulties in reading the Net from here...)

Thanks in advance,

   Chris Chiesa
     full of application ideas for VR!

zahid@gec-rl-hrc.co.uk (Dr. Zahid Hussain (G11)) (10/15/90)

Dear all,
   Does anyone have a bibliography on virtual reality?  I just need
something to get me started.  Many thanks,   **Zahid.

P.S. Apologies in advance if I'm wasting bandwidth by posting to the wrong
group.