[sci.virtual-worlds] Description of Virtual Worlds work at UNC

hlr@well.sf.ca.us (Howard Rheingold) (02/09/90)

I'd like to respond to the requests made in this newsgroup
for people working in the field of virtual worlds to describe
their work.

I'm Warren Robinett, project manager of the Head-Mounted
Display project in the Computer Science Department at
the University of North Carolina at Chapel Hill (UNC). My
virtual worlds genealogy traces back to NASA Ames where
I wrote much of the software for the Virtual Environment
Workstation (the NASA head-mounted display) in 1986 and '87.

The work in virtual worlds here at UNC is focused on
building a usable and useful head-mounted display system
with three principal applications in mind:

1. Visualizing the 3D shapes of complex protein and drug
molecules, to help working biochemists design drugs and
understand biochemical processes. Professor Fred Brooks
has been working with biochemists for 20 years, pursuing
the use of computer graphics and force feedback to help
them solve their problems.
Professor Dave Richardson of the Duke Biochemistry Department
is spending his sabbatical year working here with us.
At this point our HMD can display a CPK (colored spheres)
model of a protein with several hundred atoms, along with
a drug molecule of about 50 atoms. The HMD user can grab
the drug with a hand-held manipulator and attempt to "dock"
it into the protein (a sketch of this grab-and-move logic
appears after this list).

2. X-ray vision for doctors using a head-mounted display
and a sensing device such as an ultrasound transducer. We have
a very strong group here doing research in 3D medical image
display led by Professors Henry Fuchs and Steve Pizer,
again with more than two decades' worth of experience. We
have strong ties with the UNC Radiology and Radiation Oncology
Departments, which give us access to real anatomical
image data and real clinical problems. Two goals we are
working towards are real-time volume rendering
and real-time reconstruction of a 3D image from
the data produced by a 2D ultrasound scanner.
We hope that in a year or so we will be able to feed the
images acquired by the real-time sensors, properly
transformed to the HMD wearer's viewpoint (the transform
involved is sketched after this list), into the display,
and achieve the subjective feeling of seeing inside living
tissue. We intend to use a "see-through" HMD for this
application (an HMD that lets the user see the real world
with the computer graphics superimposed on it). Currently,
Jim Chung is exploring one medical application of the HMD:
planning the
3D placement of the beams used to irradiate tumors
in cancer treatment.

3. Architectural walk-through. Once the computer model
of a building has been created (a tedious job), an HMD can
be used to walk around inside the building before any
construction has taken place. We can currently do this
with buildings consisting of several thousand polygons.
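
For the curious, here is a rough sketch in C of the
grab-and-move logic from application 1 above. It is an
illustration, not our actual code; all the names are made up.
The idea is just that while the grab switch is down, the drug
keeps a fixed pose relative to the hand sensor, so it follows
the hand.

    /* One rigid transform (rotation + translation), kept as a
       row-major 4x4 matrix with the translation in the last
       column. */
    typedef struct { double m[4][4]; } Xform;

    /* Multiply two transforms: r = a * b. */
    Xform xform_mul(Xform a, Xform b)
    {
        Xform r;
        int i, j, k;
        for (i = 0; i < 4; i++)
            for (j = 0; j < 4; j++) {
                r.m[i][j] = 0.0;
                for (k = 0; k < 4; k++)
                    r.m[i][j] += a.m[i][k] * b.m[k][j];
            }
        return r;
    }

    /* Invert a rigid transform: transpose the rotation,
       back-rotate and negate the translation. */
    Xform rigid_inverse(Xform a)
    {
        Xform r;
        int i, j;
        for (i = 0; i < 3; i++)
            for (j = 0; j < 3; j++)
                r.m[i][j] = a.m[j][i];
        for (i = 0; i < 3; i++) {
            r.m[i][3] = 0.0;
            for (j = 0; j < 3; j++)
                r.m[i][3] -= r.m[i][j] * a.m[j][3];
            r.m[3][i] = 0.0;
        }
        r.m[3][3] = 1.0;
        return r;
    }

    /* Called once per frame with the latest tracker reading
       for the hand-held manipulator.  "hand" is its pose in
       world coordinates; "grab_down" is its grab switch. */
    void update_drug(Xform hand, int grab_down,
                     Xform *drug, Xform *drug_in_hand,
                     int *grabbing)
    {
        if (grab_down && !*grabbing) {   /* just pressed: record */
            *drug_in_hand =              /* drug's pose relative */
                xform_mul(rigid_inverse(hand), *drug); /* to hand */
            *grabbing = 1;
        } else if (!grab_down)
            *grabbing = 0;

        if (*grabbing)                   /* drug rides along */
            *drug = xform_mul(hand, *drug_in_hand);
    }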

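In the same illustrative spirit, here is the transform chain
behind the X-ray-vision idea in application 2: each echo sample
lives in the ultrasound transducer's coordinate frame and has to
be re-expressed in the HMD wearer's eye frame before it is
drawn. Xform and rigid_inverse are from the sketch above; the
pose names are hypothetical.

    typedef struct { double x, y, z; } Point;

    /* Apply a rigid transform (as above) to a point. */
    Point xform_point(Xform t, Point p)
    {
        Point r;
        r.x = t.m[0][0]*p.x + t.m[0][1]*p.y + t.m[0][2]*p.z + t.m[0][3];
        r.y = t.m[1][0]*p.x + t.m[1][1]*p.y + t.m[1][2]*p.z + t.m[1][3];
        r.z = t.m[2][0]*p.x + t.m[2][1]*p.y + t.m[2][2]*p.z + t.m[2][3];
        return r;
    }

    /* Map one echo sample from the scanner's image plane into
       the wearer's eye coordinates, so it draws registered with
       the patient.  Both poses come from trackers: one sensor
       on the transducer, one on the headgear (times a fixed
       head-to-eye offset measured once). */
    Point sample_to_eye(Point p_in_transducer,
                        Xform transducer_to_world,
                        Xform eye_to_world)
    {
        Point p_world =
            xform_point(transducer_to_world, p_in_transducer);
        return xform_point(rigid_inverse(eye_to_world), p_world);
    }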

The hardware we are using here at UNC is:

1. Two models of HMD headgear: a VPL EyePhone, and a
somewhat similar HMD built from LCD TVs and a bicycle
helmet by a team at the Air Force Institute of Technology
under Major Phil Amburn.

2. We use the Polhemus 3Space magnetic tracker
for tracking the user's head and hand, as do all other
current virtual worlds systems that I am aware of.
Because of inherent problems with the Polhemus, especially
lag and range (a sketch of one standard lag-hiding trick
appears after this list), we also have another tracker
under development -- an optical tracker that will allow a
large room-sized working environment.

3. Our main computer graphics engine is the Pixel-Planes 4
processor-per-pixel graphics system, designed by 
Professors Henry Fuchs and John Poulton. It can render
around a thousand arbitrarily large shaded
polygons or spheres in 1/30 second.
Its successor, Pixel-Planes 5, which we expect to come
up in the next six months, will give us a 20-fold increase
in real-time graphics image complexity.

4. Our principal manual input device is a hand-held manipulator
(a billiard ball) with switches mounted on it
and a Polhemus position-and-orientation sensor mounted
inside it. We also have several other input devices:
joysticks, a Spaceball, and a VPL DataGlove.

5. We are using a Macintosh computer as a dedicated
sound server to produce digitized sounds in synchronization
with events in the virtual world.

6. We have a working 6-degree-of-freedom (i.e., both linear
force and torque) force-feedback system, the Argonne
Remote Manipulator (ARM). The ARM has been used to
help biochemists by giving force-feedback in addition
to computer graphics in drug-docking tasks. It has not
yet been used with the HMD. We intend to integrate the
ARM with the HMD in the next few months.
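
To give the flavor of the force computation, here is a textbook
spring model (an illustration -- not necessarily what our
simulation does): when the grip penetrates a virtual wall, push
it back along the wall's normal in proportion to how deep it
has gone.

    typedef struct { double x, y, z; } Vec;

    #define STIFFNESS 200.0   /* spring constant, illustrative */

    /* Force to command to the ARM's grip when it is at
       grip_pos and the wall is the plane (n . p = offset),
       with unit normal n pointing out of the wall. */
    Vec wall_force(Vec grip_pos, Vec n, double offset)
    {
        Vec f = { 0.0, 0.0, 0.0 };
        double d = grip_pos.x*n.x + grip_pos.y*n.y
                 + grip_pos.z*n.z - offset;  /* signed distance */
        if (d < 0.0) {                 /* grip inside the wall  */
            f.x = -STIFFNESS * d * n.x; /* spring pushes it back */
            f.y = -STIFFNESS * d * n.y; /* out along the normal  */
            f.z = -STIFFNESS * d * n.z;
        }
        return f;
    }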

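And about the tracker lag mentioned in item 2 above: one
standard trick, sketched here with the Vec type from the force
example, is to extrapolate the reported position ahead by the
known latency. This is an illustration only; the fix we are
actually pursuing is the optical tracker.

    /* Given the last two tracker readings, dt seconds apart,
       guess where the head will be "latency" seconds after
       the newer one. */
    Vec predict_position(Vec prev, Vec curr,
                         double dt, double latency)
    {
        Vec pred;
        pred.x = curr.x + (curr.x - prev.x) / dt * latency;
        pred.y = curr.y + (curr.y - prev.y) / dt * latency;
        pred.z = curr.z + (curr.z - prev.z) / dt * latency;
        return pred;
    }

Differentiating noisy Polhemus readings amplifies jitter,
though, which is part of why we would rather have a better
tracker than a better guess.
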
This hardware gives us the ability to synthesize virtual
worlds that are in color, are seen stereoscopically
through the head-mounted display, and are of moderate
image complexity. Monophonic sounds can accompany events
in the virtual world (or be events in their own right
with no accompanying graphics). When the ARM is integrated,
the user will be able to feel forces and torques in
the virtual world (for example, bumping into objects)
as mediated by the hand-grip on the ARM.
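
The event-to-sound hookup just mentioned is about as simple as
it sounds. A hypothetical version in C (the device name and the
one-byte protocol are made up for illustration, not our actual
setup):

    #include <fcntl.h>
    #include <unistd.h>

    static int sound_fd = -1;

    /* Open the serial line to the Macintosh sound server. */
    void sound_open(void)
    {
        sound_fd = open("/dev/ttya", O_WRONLY);
    }

    /* Tell the Mac to play digitized sound number "which",
       e.g. when the user bumps into or grabs something. */
    void sound_play(unsigned char which)
    {
        if (sound_fd >= 0)
            write(sound_fd, &which, 1);
    }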


I've written more than I intended, so I'll stop here.
But one last comment.

Here at UNC, we don't look upon the head-mounted display
merely as a kind of "electronic LSD," as last week's
Wall Street Journal article described artificial reality.
We think we can use the HMD to do a better job of solving
real problems, ones where visualization of complex 3D
information is crucial.
 
My personal view is that computer graphics in general,
and the head-mounted display in particular, can allow
people to see (and therefore understand) things that
would otherwise be invisible (and therefore not
understood or even noticed). I call this expansion
of human perception. Fred Brooks calls it intelligence
amplification. Some people call it scientific 
visualization.

Now don't get me wrong. I have spent a large fraction
of my working life designing computer games, and I
think that the HMD has wonderful possibilities for
entertainment, too. We'll be having a little fun with
that application here at UNC. One of my pet
projects is a simulation of the solar system and nearby
stars, naturally including a spaceship to travel
through it. Warp Factor 9, Mr. Sulu.