[sci.virtual-worlds] Neural Interfacing and VR

keithley@applelink.apple.com (Craig Keithley) (10/25/90)

Neural Interfacing (and its application to VR) might be here sooner than 
you think.

Some 15 years ago a UCLA research project successfully used VERY low end 
minicomputers to detect specific thoughts.  The basic idea was/is that it 
is possible to pattern match a properly filtered set of brain waves against 
a previously captured sample.  When you've got a good match, you can respond 
to the command in a manner similar to voice recognition.  
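The matching step described above can be sketched in a few lines.  This is only an illustration of template correlation against a stored sample, not code from the UCLA project; the function names, signals, and threshold are all made up:

```python
import math

def normalized_correlation(window, template):
    """Pearson correlation between two equal-length sample sequences."""
    n = len(template)
    mw = sum(window) / n
    mt = sum(template) / n
    num = sum((w - mw) * (t - mt) for w, t in zip(window, template))
    den = math.sqrt(sum((w - mw) ** 2 for w in window) *
                    sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

def matches(window, template, threshold=0.9):
    """Treat the window as a recognized 'command' if it correlates
    strongly enough with the previously captured sample."""
    return normalized_correlation(window, template) >= threshold

# A stored sample and a noisy new window that closely resembles it:
template = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0]
signal   = [0.1, 1.1, 2.0, 0.9, 0.0, -1.0, -1.9, -1.1]
print(matches(signal, template))  # -> True
```

The filtering step (isolating the frequency bands of interest before matching) is omitted here; in practice it would precede the correlation.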

I recommend that you go to your local university library, and borrow the 
thesis entitled:
    Computer System Architecture for a Brain Computer Interface
    Ronald Olch
    1975(?)

Ronald's master's thesis was pretty dry reading about the actual physical 
layout of the lab and the kinds of minicomputers, punch tape readers, 
plotters, cabling, etc, etc.  There was a notable absence of specific 
algorithms and sample code.  I did chuckle a bit when I read a paragraph 
describing what the coming powerful (in 1975?) microprocessors would do to 
improve this process of recognizing thoughts.  It's 1990, can you say DSP?

Having read this, I managed to track Mr. Olch down and interrogated :-) him 
about the actual mechanics of recognizing specific thoughts.   Very 
interesting stuff.  It's not as hard as you might think (pun intended!).  
Mr. Olch is no longer involved with this research... But he did tell me 
that the principal researcher was a Jacques Vidal.   While Jacques 
hasn't published under his own name, he has co-authored a variety of 
papers which have a lot to do with where VR/Neural Interfacing is going.

If anyone has any more thoughts and/or questions, send them to me 
directly... It'll keep the neural noise on sci.virtual-worlds to a 
reasonable dB.



Craig Keithley, Apple Computer
keithley@applelink.apple.com 
keithley@apple.com
IT'S NOT MY FAULT! [standard disclaimer]

FC137501@YSUB.YSU.EDU (Paul M. Mullins) (10/29/90)

Specific references follow...

> From: keithley@applelink.apple.com (Craig Keithley)
> Neural Interfacing (and its application to VR) might be here sooner than
> you think.
>
> Some 15 years ago a UCLA research project successfully used VERY low end
> minicomputers to detect specific thoughts.  The basic idea was/is that it
> is possible to pattern match a properly filtered set of brain waves against
> a previously captured sample.  When you've got a good match, you can respond
> to the command in a manner similar to voice recognition.
>
> I recommend that you go to your local university library, and borrow the
> thesis entitled:
>     Computer System Architecture for a Brain Computer Interface
>     Ronald Olch
>     1975(?)
>
> Mr. Olch is no longer involved with this research... But he did tell me
> that the principal researcher was a Jacques Vidal.   While Jacques
> hasn't published under his own name, he has co-authored a variety of
> papers which have a lot to do with where VR/Neural Interfacing is going.
>
> Craig Keithley, Apple Computer

I did check up on some of this work for a related purpose - measurement of
stress to gauge user satisfaction with features of the computer interface
(submitted for publication).  The specific references for (some of) J.J. Vidal's
work are (from my notes, originally Comp & Info Syst Abstr):

"Toward direct brain-computer communication,"  J.J. Vidal.  Annual Review
of Biophysics and Bioengineering, Vol 4, p. 157.

"Biocybernetic control in man-machine interaction" (AD-777 720/4GA),
J.J. Vidal.  NTIS, Springfield, VA, April 1974, 101 pages.

Same, (N77-15662), J.J. Vidal, M.D. Puck, R.J. Hickman, R.H. Olch.
NTIS, Springfield, VA.  March 1976, 99 pages.

Same, (N77-15663), by Same.  NTIS, Springfield, VA.  September 1975, 82 pp.

"Real-time detection of brain events in EEG," J.J. Vidal.  IEEE Proceedings,
Vol. 65 (5), May 1977, pp. 633-641.

"Generation of ECG prototype waveforms by piecewise correlation averaging,"
Hecht and Vidal.  IEEE Trans on Pattern Analysis & Machine Intelligence,
Vol PAMI-2 (5), September 1980, pp 415-420.

I must admit that I didn't track down all these references since it soon
became obvious that there is a lot of this sort of research.  Most of it
is found in Psychology libraries under the title "evoked potentials."  It
is used for various purposes ranging from measurement of stress and mental
workload (MACINTER has some related articles) to measurement of visual
acuity in children.  It is also used to monitor patients during surgery from
remote sites (Sclabassi, PhD, MD at U Pitt was doing some of this type
of work).

Paul Mullins   FC137501@YSUB.YSU.EDU

frerichs@ux1.cso.uiuc.edu (11/01/90)

These are some pretty OLD references...

Are there any that are more current?

-dfRERICHS
Dept of CompEng, UIUC.

FC137501@YSUB.YSU.EDU (Paul M. Mullins) (11/01/90)

> OLD references ...

Yup, but that doesn't make the research any less valid.  The specific
references were in support of the previous post which mentioned the
same project.

There are literally volumes of recent work, although not directly aimed
at "neural interfaces."  Any general text on psychophysiology should get
you started - I used "Techniques in Psychophysiology," by Martin & Venables
as a foundation.  That isn't really where my search started (or ended)
because I had a particular interest in physiological indicators of stress.

I can assure you that any library will contain references concerning
psychophysiology, evoked potentials, and bioengineering (medical monitoring
equipment).  For the really current stuff you would need a researcher in one
of those areas (or something related that I have forgotten).  You may be
surprised how much has been done with monitoring stress and workload at the
user interface (mostly in Europe).
A keyword search or just asking the librarian is likely to be far more
useful than my specific (pointed) references, especially since I perused
a lot that didn't even get noted.

Disclaimer - I am NOT an expert on these topics (as I indicated previously).
I was intrigued by a possible use of one aspect of these techniques in my
own area (user interface design/analysis) and did a little background work.
Actually more like a couple of months, but that isn't much compared to those
working in these areas.  Also, I was interested only in nonintrusive
techniques.  There is also a large body of work which requires needles and
such.  Wafers and neurons???  I couldn't get enough volunteers.

Any real experts out there that can pick up this thread?  If not, we can
try sci.med and sci.psychology (maybe comp.cog-eng).

Paul Mullins  FC137501@YSUB.YSU.EDU

lmiller@aerospace.aero.org (Lawrence H. Miller) (11/06/90)

In article <9963@milton.u.washington.edu> keithley@applelink.apple.com (Craig Keithley) writes:
>
>
>Neural Interfacing (and its application to VR) might be here sooner than 
>you think.
>
>Some 15 years ago a UCLA research project successfully used VERY low end 
>minicomputers to detect specific thoughts.  The basic idea was/is that it 
>is possible to pattern match a properly filtered set of brain waves against 
>a previously captured sample.  When you've got a good match, you can respond 
>to the command in a manner similar to voice recognition.  

        The person in charge was Dr. Jacques Vidal.  His lab
        was called the brain-computer interface lab.  Here is a
        relevant publication (refer format):

%A Jacques J. Vidal
%T Real-Time Detection of Brain Events in EEG
%J Proceedings of the IEEE
%V 65
%N 5
%D May, 1977
%P 633-641

        It is important to emphasize that this work did not
        detect "thoughts" at all, and never claimed that it
        did.  Rather, it detected "event-related potentials",
        the event being a stimulus (a flashing light).  Based
        on which direction you were LOOKING (not thinking
        about looking, but actually looking), the project
        applied pattern recognition techniques to determine
        the gaze direction from the evoked EEG signal.
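The approach described above (stimulus-locked epochs, then picking the gaze direction whose stored prototype best fits the response) can be sketched roughly as follows.  Everything here -- the function names, the prototypes, and the toy data -- is illustrative, not taken from Vidal's system:

```python
import math

def average_epochs(epochs):
    """Point-wise average of stimulus-locked EEG epochs.
    Averaging cancels noise and leaves the event-related potential."""
    n = len(epochs)
    return [sum(e[i] for e in epochs) / n for i in range(len(epochs[0]))]

def correlation(a, b):
    """Pearson correlation between two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def classify_gaze(epochs, prototypes):
    """Return the gaze direction whose stored prototype waveform
    best correlates with the averaged evoked response."""
    erp = average_epochs(epochs)
    return max(prototypes, key=lambda d: correlation(erp, prototypes[d]))

# Toy prototype waveforms for two gaze directions, and two noisy
# epochs recorded after a flash while the subject looked left:
prototypes = {"left":  [0, 1, 2, 1, 0],
              "right": [0, -1, -2, -1, 0]}
epochs = [[0.1, 1.0, 2.1, 0.9, 0.0],
          [-0.1, 1.1, 1.9, 1.0, 0.1]]
print(classify_gaze(epochs, prototypes))  # -> left
```

A real system would time-lock the epochs to the stimulus onset and filter the raw EEG first; only the classification step is sketched here.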

Larry Miller
Aerospace Corporation
lmiller@aerospace.aero.org
213-336-5597