[sci.virtual-worlds] Telepresence

almquist@brahms.udel.edu (Squish) (04/17/91)

I received a few responses to my last posting about biocybernetics,
so I figured it was time to post another article from my collection.
This one is "Telepresence" by Marvin Minsky, 1980.

Mr. Minsky points out that if one REALLY wants to get an impression
of what telepresence is and what it could be used for, one should
read Heinlein's book "Waldo" (1948).  This book is about a crippled
youth who invents numerous gadgets so that he can remotely interact
with the outside world - i.e., telepresence.

From discussing the book and various aspects of what telepresence is
and what is necessary for it, he slips into a discussion of "Why did
telepresence stop evolving 20 years ago?"  Funding, repetition in
work (everyone working on robot hands and computer vision), and
lack of centralization (who's doing what - this is STILL needed).
Interesting note: DARPA actually allocated some funding for work on
a powered armored suit like the ones in Heinlein's "Starship
Troopers".  Now, that would've been neat to see!

According to the article, J.C. Bliss and J.G. Linvill at Stanford
created a device that would translate print into "feel" enabling
blind people to read.  This gadget fits on your fingertips and
has many miniature photocells that sense light and little vibrators
that allow the finger to sense remotely the fine shape of letters.
Whatever happened to this work?  Has anyone seen it?  Is anyone
still working with/on it?  Perhaps this would be a good solution
to the tactile problem?  In addition, Mr. Minsky talks about one
of his graduate students, Danny Hillis, who fabricated a thin,
skinlike material that can "feel" and transmit small tactile
surface features.  Again, does anyone have any further info?
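
To make the reading-aid idea concrete, here is a rough sketch in C of
the basic mapping such a device has to perform.  It is purely
hypothetical - I have no idea how the actual Bliss/Linvill hardware
was wired, and the array size and threshold are invented:

    /* Hypothetical sketch of the reading-aid idea: threshold a small
     * array of photocell readings and drive the vibrating pin under
     * each dark (inked) cell.  Sizes and threshold are made up. */

    #define ROWS 24
    #define COLS  6

    void update_pins(unsigned char light[ROWS][COLS], /* 0..255      */
                     unsigned char pin[ROWS][COLS])   /* 1 = vibrate */
    {
        int r, c;

        for (r = 0; r < ROWS; r++)
            for (c = 0; c < COLS; c++)
                pin[r][c] = (light[r][c] < 128) ? 1 : 0; /* dark->buzz */
    }

The clever part is obviously the optics and the tiny actuators, not
the software; the mapping itself is about that simple.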

The article ends by recommending that interested individuals find
"Remotely Manned Systems", edited by Ewald Heer (Caltech, 1973),
and "Human Factors Applications in Teleoperator Design and
Operations", by E.G. Johnsen and W.R. Corliss (Wiley, 1971).

Are we reinventing the wheel again?  Has anyone been working with
any of the ideas or systems previously developed?  Is there a
centralized body or database of individuals (past and present)
and the work that they are doing?  Wasn't the INTERNET set up
to promote the interchange of ideas?  Am I talking to myself? (-:

- Mike Almquist (almquist@brahms.udel.edu)

fink@acf5.NYU.EDU (Howard Fink) (04/17/91)

The lack of development of promising ideas is nothing new in this society,
or any other.  Fifty years before Columbus, the Chinese sent an armada to
Africa, but political changes back home, among many other factors, prevented
any further exploration.  Airbags in cars were developed and discussed in
the mid-sixties, but only now are they being installed in automobiles.  
Before rockets achieved orbital velocity, the upper atmosphere was being 
explored with sounding rockets, and proposals for ramjet-powered ships using
atomic oxygen (the size of aircraft carriers!) were being developed.
Once orbital velocity was achieved, the exploration of the exosphere was no
longer the priority.  A shuttle with a satellite on tether will fly in a few
years.
        What does this have to do with virtual-worlds or telepresence?  The
greatest funding for telepresence occurred during the X-6 project, otherwise
known as the nuclear airplane.  General Electric had trucks with arms for 
refueling the airplane and servicing the engine.  A nuclear airplane could
patrol for weeks, or fly with unlimited range.  After $350 million of those
$35-per-ounce-of-gold dollars was spent, ICBMs eliminated the need for these
planes.  One plane flew with a reactor aboard, but the reactor never supplied
power to the engines, which were never built.  Interestingly, GE is building
a reactor for flying in space, and is repeating the same problems: overweight,
overpriced, and late.
        If you want the biggest trove of tele-operated development materials,
check out GE's files.

cygnus@cis.udel.edu (Marc W. Cygnus) (04/22/91)

In article <1991Apr17.052444.17140@milton.u.washington.edu>
almquist@brahms.udel.edu (Squish) writes:
-> <etc deleted...>
->
->According to the article, J.C. Bliss and J.G. Linvill at Stanford
->created a device that would translate print into "feel" enabling
->blind people to read.  This gadget fits on your fingertips and
->has many miniature photocells that sense light and little vibrators
->that allow the finger to sense remotely the fine shape of letters.
->Whatever happened to this work?  Has anyone seen it?  Is anyone
->still working with/on it?  Perhaps this would be a good solution
->to the tactile problem? [...]

Interesting thought.  Yes, I believe it's still around.  Two observations:
(1) The device is meant to convey information about the shape of letters,
and accomplishes this via a matrix (if I remember correctly) of vibrating
pins.  The vibration helps the user to distinguish between letter and
non-letter.  (2) People who have used the device more than just occasionally
report a loss of fingertip sensitivity, presumably due to the repeated
"overstimulation" of the touch receptors.  [ I don't remember my source for
the latter, but it was verified in a conversation with a friend who works
with rehabilitative technologies at a local medical institute ]

-> [...] In addition, Mr. Minsky talks about one
->of his graduate students, Danny Hillis, who fabricated a thin,
->skinlike material that can "feel" and transmit small tactile
->surface features.  Again, does anyone have any further info?
->
-> <etc deleted...>
->- Mike Almquist (almquist@brahms.udel.edu)

Now that sounds cool.  What I would like in *my* version of reality is
true rough texture.  [ I know, might as well ask for a direct occipital
interface too, eh? ;-) ]

I wonder how one might approach true "sandpaper"-like texture... IMHO it
will not be duplicated with a technology that relies on an X by Y array
of elements which are raised or lowered in a Z direction to simulate small
tactile features (the Disney pin-box thing...).  Can it?  Better yet, has
it?
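
To pin down (sorry) why I'm skeptical, here's a toy model in C of
what such an array can actually reproduce - all the sizes are
invented: you average a fine surface-height field over each pin's
footprint and quantize to a handful of discrete heights, and anything
finer than one pin is simply gone.

    /* Toy model of the pin-array idea: average a fine height field
     * over each pin's footprint and quantize to a few discrete pin
     * heights.  All sizes invented; sub-pin detail is simply lost. */

    #define SURF   256            /* fine surface samples per side */
    #define PINS    16            /* pins per side                 */
    #define LEVELS   8            /* discrete pin heights          */

    void surface_to_pins(float surf[SURF][SURF],  /* 0.0 .. 1.0    */
                         int   pin[PINS][PINS])   /* 0 .. LEVELS-1 */
    {
        int px, py, sx, sy, n = SURF / PINS;
        float sum;

        for (py = 0; py < PINS; py++)
            for (px = 0; px < PINS; px++) {
                sum = 0.0f;
                for (sy = 0; sy < n; sy++)
                    for (sx = 0; sx < n; sx++)
                        sum += surf[py * n + sy][px * n + sx];
                pin[py][px] = (int)(sum / (n * n) * (LEVELS - 1));
            }
    }

Sandpaper grit lives well below the spatial resolution any plausible
pin grid could offer, which is why I suspect the answer is "no".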

Of course, then there's the question of exactly how much "roughness" we
would want to simulate.  It might be a bummer to skin your elbows on the
virtual wall around which you're jumping.

But then again, maybe it might *not* be!  How far could tactile texture
technology (wow) go before the risks of injury got out of hand?  (now, it
*would* be a bummer to skin that elbow because of a code error somewhere)

                                        -marcus-
-- 
-----------------------------------------------------------------------------
"Opinions expressed above are not necessarily those of anyone in particular."
 UDel Artificial Life Group (Graphics Support)  |  INET: cygnus@cis.udel.edu
 114a Wolf Hall (Irisville) (302) 451-6993      | CompSciLab: (302) 451-6339

mg@godzilla.cgl.rmit.OZ.AU (Mike Gigante) (04/22/91)

One of our VR projects here is on Telepresence.

The performance artist Stelarc is working with us on a teleoperation
environment.

Stelarc is well known for his "third arm" performances; the third arm is
generally controlled by electrical impulses from electrodes attached
to various parts of the body (e.g. the stomach).  Flexing the stomach
muscles in a controlled manner then directly controls the arm.  His work
has been widely exhibited and performed, including at SISEA in Holland
last year, and he has also spent many years in Japan.
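
For those wondering how muscle signals become arm motion: the general
scheme (a simplified sketch in C follows; it is not Stelarc's actual
controller, just the textbook idea) is to rectify the raw electrode
signal, smooth it into an "envelope", and switch the actuator on when
that envelope crosses a threshold.

    /* Simplified sketch of EMG-style control: rectify the raw
     * electrode signal, low-pass it into an envelope, and switch the
     * actuator on when it crosses a threshold.  Constants invented. */

    #define SMOOTH    0.05   /* envelope smoothing factor        */
    #define THRESHOLD 0.30   /* flex level that triggers the arm */

    static double envelope = 0.0;

    /* called once per raw EMG sample; returns 1 to actuate the arm */
    int emg_to_command(double raw)
    {
        double mag = (raw < 0.0) ? -raw : raw;     /* rectify   */

        envelope += SMOOTH * (mag - envelope);     /* smooth    */
        return (envelope > THRESHOLD) ? 1 : 0;     /* threshold */
    }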

At the RMIT ACGC, he is working with Mike Papper and me on the
control of an extended manipulator using gesture input.
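
The gesture side is conceptually similar but continuous.  As a first
cut - again just an illustrative sketch, not our actual code - a
normalized glove reading can be mapped linearly onto one joint of the
manipulator and clamped to its mechanical limits:

    /* Illustrative only: map a normalized gesture value (say finger
     * flex, 0.0 = open, 1.0 = fist) onto one manipulator joint,
     * clamped to that joint's mechanical limits. */

    struct joint { double min_deg, max_deg, cmd_deg; };

    void gesture_to_joint(double flex, struct joint *j)
    {
        if (flex < 0.0) flex = 0.0;    /* clamp noisy sensor input */
        if (flex > 1.0) flex = 1.0;

        j->cmd_deg = j->min_deg + flex * (j->max_deg - j->min_deg);
    }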

This is a very new project, and we are still in the earliest stages.
I will post more about our progress in a few months' time.

Just to update some previous messages, we also have a virtual clay
modelling project with Barry Fowler, who is working with the sculptor
Robert Owen.  This project is also in its early stages; much work still
needs to be done on the underlying software before the goggles and
gloves will even be useful.

Mike Gigante
Director, RMIT Advanced Computer Graphics Centre
Royal Melbourne Institute of Technology
Melbourne, Australia

mg@godzilla.cgl.rmit.oz.au