rick@eos.arc.nasa.gov (Rick Jacoby) (11/27/90)
Another lurker responds. I work in the VIEW (Virtual Interactive Environment Workstation) lab at NASA Ames Research Center. The VIVED project was started in 1984 by Mike McGreevy, and the VIEW project was an offshoot of VIVED. When I joined the project in 1987, Scott Fisher was running it. Some of the hardware development, and almost all of the software development and systems integration, have been done by Sterling Software, a contractor to NASA Ames.

Under Scott Fisher's direction, the VIEW project developed a general-purpose, multi-sensory, personal simulator and telepresence device. The initial operating configuration was completed last spring; it included head and hand tracking (Polhemus), wide field-of-view stereo head-mounted displays (LEEP optics and monochrome LCDs), 3-D audio output (Crystal River Convolvotron), a glove for gesture recognition (VPL DataGlove), a boom-mounted CRT display (Sterling Software), and a remote camera platform (Fake Space). Some of the initial scenarios developed for VIEW were teleoperation of a (virtual) Puma arm robot, an astronaut EVA scenario, a virtual theremin, and fluid flow visualization.

The lab's current direction is to focus on a few specific research issues. One project is to connect the VIEW environment to a real Puma arm and study the effect of improved force-torque displays on teleoperation. Another project, perhaps the most interesting to VR enthusiasts, is a joint study between the VIEW lab and the computational fluid dynamics group at Ames; it will study human interaction with various virtual control devices. A third project simply uses the VIEW environment as a simulator to study the amount of fuel a 'lost' astronaut would use trying to fly back to the space station.

Our host computer is an HP 9000/835, and the graphics can be processed on either a Canadian-made graphics computer (ISG Technologies) or the HP's SRX graphics boxes.
We have an 18-page Macintosh document that gives an overview of our hardware sub-systems and software conventions and paradigms. I will e-mail a text-only version of this document to readers who e-mail me a request for it before the end of January.

The VIEW lab is currently under an umbrella group called the Spatial Displays and Instruments Labs. Two other labs in this group are the Spatial Audio Display lab and the Virtual Planetary Exploration (VPE) lab.

The Audio lab, run by Beth Wenzel, is developing digital signal-processing techniques and the technology platform required for three-dimensional auditory displays. The work is based on psychoacoustic principles of auditory localization, and there is ongoing supporting research aimed at perceptually validating and improving the display. The real-time hardware, called the Convolvotron, was developed with Scott Foster of Crystal River Engineering.

The VPE lab, run by Mike McGreevy, is investigating ways to help planetary geologists remotely analyze the surface of a planet. They are applying virtual reality techniques to "virtually explore" planetary terrains. Their host computer is a Stardent GS2000, and they are using VPL Eyephones. Their data are height fields derived from Viking images of Mars, and a typical scene will contain tens of thousands of polygons.

Rick Jacoby
rick@eos.arc.nasa.gov
Sterling Software
NASA Ames Research Center