[sci.virtual-worlds] RPI Advanced Technology Group

cyberoid@milton.u.washington.edu (Robert Jacobson) (10/26/90)

[David Reed, a frequent contributor to The WELL's vr conference, 
and a principal in RPI, kindly consented to having me crosspost the 
following announcement from his firm to sci.virtual-worlds.  You 
can send queries to David at rpi@well.sf.ca.us.  I will deliver 
comments posted here to David at The WELL.  -- Bob Jacobson, 
Moderator]
 
RPI Advanced Technology Group,
General Project Information.
 
(c) 1990, All Rights Reserved.
 
RPI Advanced Technology Group is a research facility developing 
the finest hardware and software solutions in Synthetic Digital 
Environments (SDEs) for visualization, design and analysis-related 
applications. Our systems provide both remote real-time and 
synthetic-digital input simultaneously. RPI was the first to produce a 
fully-integrated SDE system for general commercial use and has 
been producing custom system solutions for over 20 years.
 
Our systems feature over 30 hardware, software, methodology and 
pricing advantages over any similar solution. Our entry-level 
system is the Cyberchair (tm) electronic furniture for absolute and 
synthetic environments. The Cyberchair can interface with your DOS or 
UNIX workstation and offers many hardware and software utilities 
to customize the system to user requirements.
 
This electronic furniture/processor is a stand-alone component of the 
Reality Port (tm) absolute- and synthetic-environment processor, a 
computer-based system that lets the user experience a synthesized 
environment as though it were actually before them, and within that 
environment to manipulate objects that would require tremendous 
effort and cost to manipulate in the real world. The user wears 
ordinary work clothes, and commands the system to simulate an 
environment so convincing that he or she can see, feel, hear, touch 
and smell what is there as if it were real.
 
Aerospace engineers can simulate flight tests during which they can 
clearly see air-flow patterns, and, either using their hands or 
entering voice commands into the computer, manipulate the shape of 
the aircraft to alter those patterns.
 
The system can simulate the trauma ward of a big-city hospital or a 
remote rural clinic for the purpose of training doctors in 
emergency-surgical procedures or actually performing those 
procedures in miniature. It can put civil engineers beside a bridge, 
collapse the bridge during an earthquake, and let the engineers 
rebuild the bridge as it is crumbling before them, until it ceases to 
fail in the tremors. Manufacturers of complex products will save 
millions of dollars by using the Reality Port (tm) environment 
synthesizer as a testing ground; doctors and other scientists will 
probe the physical world in startling new ways, and environmental 
planners and developers will experience and be able to demonstrate 
the results of their visions before sinking public and private funds 
into designs that may contain costly errors.
 
RPI has developed its product by integrating components that 
produce computer-generated ultra-realistic sense stimuli with real-
time images, sound and other sense-data. Working together, these 
data convince the senses that they are experiencing a "real world" 
scene. The primary system technology is proprietary to RPI and is 
protected under U. S. Patent filings.
 
The company's teaming agreements bring to the product the world 
leaders in visualization software: Wavefront Technologies, Inc.; the 
high-end fiber-optic technology of Schott Fiber Optics, Inc.; the 
extraordinary advances in computer-graphics hardware by Silicon 
Graphics, Inc.; the sensing technology of Ascension Technology; and 
the Position utility of EXOS, Inc., to mention only a few.
 
RPI has created a system that, it feels, is at least four generations 
more advanced than anything discussed at the recent National 
Computer Graphics Association convention or in any design 
announced so far.  The company states that this technology and 
methodology differ from "Virtual Reality" experiments; this is 
a high-end, commercial, total-immersion environment synthesis 
system.
 
Independent consultant Roger Wilson, best known for his technical 
direction on computer graphics for Universal Studios, TRON and 
European Broadcasting, remarks that "RPI certainly has approached 
this the right way; they seem to have the resources to make this fly."
 
The Reality Port (tm) synthetic-environment processor and its 
components, including the Cyberchair (tm) electronic furniture, have 
been in development for over 18 years at RPI, which has been in 
business since 1970, specializing in customized projects for clients in 
the Fortune 500 and for government agencies.
 
The system employs the science of biostereometrics--the multi-
dimensional study of the body and technical means of enhancing its 
use of the senses via man/machine integration.
 
The system incorporates Silicon Graphics PowerVision (tm) super-
graphics computers and RPI's Cascade Software (tm) system control 
programs into an array of visualization, sonic and other sensory 
technologies. These result in full-field-of-vision, full-color stereo 
optics at resolutions over 3,000 lines, integrated with full-force-
feedback transformative touch and geometric digital sonics.
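 
As a rough, purely illustrative sketch (none of this is RPI's Cascade 
software, and every name in it is hypothetical), a system-control 
program of this kind can be pictured as a single per-frame loop that 
polls the trackers once and then fans the shared state out to each 
output channel, so that the visual, sonic and tactile streams stay in 
step:
 
/* Hypothetical sketch of a per-frame integration loop; not RPI's
 * Cascade software.  Each update_* function stands in for a real
 * device driver. */
#include <stdio.h>

typedef struct { double x, y, z; } Pose;   /* tracked head/hand position */

static Pose poll_trackers(int frame)
{
    Pose p = { frame * 0.01, 1.6, 0.0 };   /* stand-in for sensor input  */
    return p;
}

static void update_visuals(Pose p) { printf("draw frame at x=%.2f\n", p.x); }
static void update_sound(Pose p)   { printf("pan audio to x=%.2f\n", p.x); }
static void update_haptics(Pose p) { printf("set force field at x=%.2f\n", p.x); }

int main(void)
{
    for (int frame = 0; frame < 3; frame++) {
        Pose p = poll_trackers(frame);     /* one shared state per frame */
        update_visuals(p);                 /* display channel            */
        update_sound(p);                   /* spatial audio channel      */
        update_haptics(p);                 /* force-feedback channel     */
    }
    return 0;
}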
 
A prototype of each module of the entire system has been tested in 
full operation. A high-end corporate demonstration and 
development Chamber is being established in Northern California 
for client services, sales demonstrations and environment archive 
mastering, pending third-round financing. Multiple configurations, 
including two stand-alone components, are available for a wide 
variety of applications. Over 40 hardware support accessories are 
also available. Systems are currently made to order. Although entry-
level system prices run in the five- to seven-figure range, RPI 
plans to announce details on its release of the Personal Simulator 
(tm), projected to be priced in the high-end consumer range.
 
TRADEMARKS HELD BY THE PROJECT:
 
The Reality Port (tm) absolute and synthetic environment systems.
 
The Reality Chamber (tm) low- or non-intrusive environment reality 
synthesizer modules.
 
The Cybercycle (tm) digital environmental vehicle synthesizer.
 
The Medport (tm) SDE medical surgery system.
 
The Farpoint Verity Vision (tm) real-time environmental scanner 
units.
 
The Spiritphone (tm) three-dimensional sonic headset.
 
The Steelcloud (tm) audio-visual and telemetry input/output headpiece.
 
The Phasegrid (tm) arrays of non-intrusive motion, position and 
activity electronic sensing devices.
 
The Shapewall (tm) series of full force-feedback surface simulator 
systems.
 
The Directvision (tm) series of optical data input/output devices, 
featuring 8 different electro-optical methods, 1280 x 1024 
resolution and higher, color, and stereoscopy with combined 
biostereometric processing.
 
Reality Apparel (tm) interactive programmers' processing body-sensor 
webs.
 
The Cyberchair (tm) electronic furniture for interactive absolute and 
synthetic environment systems.
 
The Personal Environment Generator (tm) workstation mini-systems.
 
Cascade, Synthetic and Absolute Environment Integration 
Program (tm) software system.
 
Custom high-end complete-immersion systems and applications, from 
workstation to auditorium mode. High-resolution visuals. Third-
generation real-time biostereometrics. Spherical digital sound. 
Thrust-technology tactiles. Hardware and software utility exotics. 
PowerVision: 1,000,000 vectors/polys/sec. Mass Archiving. 
Exclusive Integration Software.
 
Protected by U. S. Trademark and Patent filings. (c) 1990 RPI. All 
Rights Reserved. PowerVision is a trademark of Silicon Graphics, 
Inc.
 
A few of The Reality Port (tm) System Exclusives and Advantages:
(c)1990 RPI, All Rights Reserved.
 
* Reality Port (tm) was the first fully-integrated, total-immersion, 
low- and non-intrusive system of its kind available in the world. It 
is software, hardware and an operating methodology that have never 
existed at this complexity or this standard of technical achievement.
 
* Visuals use RGB imaging and not the lower quality NTSC process.
 
* Visuals are 1280 x 1024 resolution and higher, not the current 
low-resolution standard of roughly 500 x 500. This provides crystal-clear, 
you-are-there images. Eight technically different Directvision (tm) units 
are employed by the system as options.
 
* A single system outputs stereoscopy, as opposed to the dual-
computer visual outputs currently in use today (a minimal sketch of 
this appears after this list).
 
* There are multiple modes and models of the system to choose from. 
Extensive custom configuration options are available.
 
* Our exclusive Cascade Programming (tm) provides the first 
efficient relationship between all interactive sensory elements, input 
and output, and all primary biological senses.
 
* The Shapewall (tm) system of interactive automated tactile surfaces 
provides the full-force feedback of actual surface structure through 
thrust-technology integration.
 
* The Reality Chamber (tm) provides defined parameter structuring 
between the Anchor System and the User Environment.
 
* The Reality Port (tm) is the first system of its kind to provide both 
Absolute (real-time remote) and Synthetic Environments in a 
concurrent experience.
 
* Our exclusive Phasegrid (tm) Array allows one to use our high-end 
systems without wearing heavy, cumbersome, sweaty gloves and 
suits with huge cables.  In the Reality Port (tm) you are more free to 
experience your designs, applications and procedures than in the 
"real" world.
 
* The stand-alone Cyberchair (tm) electronic workstation unit and the 
Cybercycle (tm) digital transportation synthesizer can operate 
independently of the Reality Chamber (tm).
 
* The system includes Silicon Graphics PowerVision (tm), scent input, 
spherical digital sound, touch, vibration, user axis distortion, alpha 
wave processing, high-frequency photonic retinal triggering, voice 
command and much more...
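 
As an illustration only of the single-system stereoscopy point above 
(this is not RPI code, and every name in it is hypothetical), one 
machine can render both eye views within the same frame by offsetting 
a single tracked camera along its right vector by half the 
interpupillary distance per eye, rather than dedicating a separate 
computer to each eye:
 
/* Minimal single-machine stereoscopy sketch; not RPI's software.
 * render_view stands in for a real per-eye draw call. */
#include <stdio.h>

typedef struct { double x, y, z; } Vec3;

static Vec3 add(Vec3 a, Vec3 b)
{
    Vec3 r = { a.x + b.x, a.y + b.y, a.z + b.z };
    return r;
}

static Vec3 scale(Vec3 v, double s)
{
    Vec3 r = { v.x * s, v.y * s, v.z * s };
    return r;
}

static void render_view(const char *eye, Vec3 cam)
{
    printf("%s eye rendered from (%.3f, %.3f, %.3f)\n",
           eye, cam.x, cam.y, cam.z);
}

int main(void)
{
    Vec3 head  = { 0.0, 1.6, 0.0 };   /* tracked head position (metres)   */
    Vec3 right = { 1.0, 0.0, 0.0 };   /* head's right vector from tracker */
    double ipd = 0.064;               /* interpupillary distance, ~64 mm  */

    /* Both eye views are produced by the same machine in the same frame. */
    render_view("left",  add(head, scale(right, -ipd / 2.0)));
    render_view("right", add(head, scale(right,  ipd / 2.0)));
    return 0;
}

Compiled as plain C, the sketch prints one camera position per eye; a 
real renderer would draw the scene from each position into its half of 
the stereo display.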
 
For Information:
RPI Advanced Technology Group
POB 14607
San Francisco, CA, USA 94114

robertj@uunet.UU.NET (Young Rob Jellinghaus) (10/29/90)

If this RPI work is as amazing as it sounds, why haven't I heard of
any of it before now?  Why isn't RPI's lab as well-known as VPL or
Autodesk in the VR game?  Saying "it's not VR, it's an environment
simulator" puzzles me, because these tools _sound_ like they push the
edge of VR work anywhere.

Has this work produced any RPI technical reports that I might be able
to acquire through university-research channels?  Are more detailed
system specs available?  What is the resolution of your input devices?
How quick is the feedback of your tactile modules?  Is your communication
protocol publicly available or licensable, and can outside developers
construct (software or hardware) modules which can supplement your system?

Hoping to learn more,

-- 
Rob Jellinghaus                 | "Next time you see a lie being spread or
Autodesk, Inc.                  |  a bad decision being made out of sheer
robertj@Autodesk.COM            |  ignorance, pause, and think of hypertext."
{decwrl,uunet}!autodesk!robertj |    -- K. Eric Drexler, _Engines of Creation_