doug@eris.berkeley.edu (Doug Merritt) (02/07/91)
Has anyone been tracking recent research into EEG (or SQUID) analysis that might be used as an effective computer input interface someday? I'm aware that some crude work was done quite a while ago on, e.g., picking the letter of the alphabet currently underneath an endlessly moving cursor by biofeedback of alpha waves. But what about higher bandwidth or more naturally targeted input?

I recall reading in the late sixties or early seventies that there was an evoked potential (P40 or something) that had been determined to be naturally produced whenever someone made a clear-cut decision to act. The example given was of a subject deciding to turn on a lamp. They could reliably sense the potential and turn on the lamp in the split second before the subject started to move his arm. (They further speculated that it might be used someday as a fire control in a fighter jet during high-gee turns, a concept picked up crudely in the movie Firefox. And one that the Air Force may actually use, for all I know.)

In virtual reality terms, this particular potential might be used in conjunction with eye tracking in order to mentally command a selection, or "move me there", or any similar single action.

The Feb. Scientific American article by Freeman about strange attractors, and unique EEG amplitude maps in response to recognition of sensory perceptions, makes me wonder if it might now be possible to sense a broad enough array of natural data from the brain to use as an effective input device. For example, if we could deduce from an EEG of the motor cortex that the user was visualizing a clenched fist, that could be used as a "grasp object" command, without even needing a data glove.

Even if things are a little cruder than that, I would think that there are, at least, a variety of readily identifiable evoked potentials that have been studied that might be usable for some useful set of commands, even if the system still needed a data glove and eye tracking to back it up.

Anyone know about this? Or know of a summary/survey source on evoked potentials and EEG analysis that sums up recent research?

A similarly interesting possibility is that of using higher-temperature superconductors to make SQUIDs that could monitor magnetic fields in the *interior* of the brain, potentially yielding even more information. (EEGs pick up only surface currents, limiting the portions of the brain that might be analyzed even in principle.) Any new info there?
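To make the single-action idea concrete, here is a minimal sketch of how such a detector might work: average stimulus-locked EEG epochs and fire a command when the averaged waveform crosses a threshold. This is purely illustrative Python on my part; the sampling rate, window, and threshold are guesses, not numbers from any published system.

    # Illustrative sketch only: detect a decision-related evoked potential
    # by averaging stimulus-locked EEG epochs.  All parameters are assumed.
    import numpy as np

    SAMPLE_RATE = 250                  # Hz; assumed digitizer rate
    WINDOW = int(0.5 * SAMPLE_RATE)    # 500 ms post-stimulus window
    THRESHOLD_UV = 8.0                 # made-up amplitude criterion, in microvolts

    def detect_decision_potential(epochs):
        # epochs: (n_trials, WINDOW) array of stimulus-locked EEG, in uV.
        # Averaging across trials suppresses background EEG (uncorrelated
        # with the stimulus) and leaves the time-locked evoked response.
        avg = epochs.mean(axis=0)
        return avg.max() > THRESHOLD_UV

    # Toy demo: 20 trials of noise with a synthetic "potential" added
    # roughly 200-300 ms after the stimulus.
    rng = np.random.default_rng(0)
    trials = rng.normal(0.0, 10.0, size=(20, WINDOW))
    trials[:, 50:75] += 15.0
    if detect_decision_potential(trials):
        print("fire command: select object under gaze")

	Doug
--
Doug Merritt		doug@eris.berkeley.edu (ucbvax!eris!doug)
			or uunet.uu.net!crossck!dougm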
tsarver@uunet.UU.NET (Tom Sarver) (02/07/91)
Doug Merritt in Message-ID: <1991Feb6.183330.8154@agate.berkeley.edu> says:

>Has anyone been tracking recent research into EEG (or SQUID) analysis
>that might be used as an effective computer input interface someday?

[Gives some examples]

>Anyone know about this? Or know of a summary/survey source on evoked
>potentials and EEG analysis that sums up recent research?

Yes, I have heard of such ongoing research. When I left the University of Florida in 1988, there was a research center organized under the Engineering College called the Center for Man-Machine Studies (or something like that). The name sounded like HCI, but in fact they did some of the EEG, brainwaves kinda stuff.

If you want more info, here's a partial address which may get there:

	Dean of the College of Engineering
	University of Florida
	CSE Building
	Gainesville, FL 32611

Hope this helps,
--Tom
-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%--%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%
% Tom Sarver: tsarver@andersen.com  | "Only Amiga makes it possible!"    //\  %
% "A real computer has a linear address space.  NO 386's!!"        \\   //--\ %
-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%--%-%-%-%-%-%-%-%-%-%-%-%-%-%-%-%\X/%-%-\-%
basiji@milton.u.washington.edu (David Basiji) (02/07/91)
Working on it...

There are a variety of bioelectric potentials that convey information through the body, EEG being one of them. EEGs are extremely complex and are difficult to analyze in real time, though there are promising techniques for doing so. You can also measure the potentials from peripheral nerves and muscles, which are prefiltered and amplified to be more suited to their specific (often motor) action. Evoked potentials suffer from the fact that your body adapts to continuous stimuli and that the process of evoking them can be annoying to the subject.

I'll post more when I get some better data.
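To give a feel for what the real-time analysis step involves, here is a minimal sketch: bandpass one raw channel to the alpha band and track its power block by block. The sampling rate, band edges, and filter order are assumptions for illustration, not parameters from our work.

    # Illustrative sketch only: track alpha-band (8-13 Hz) power in short
    # blocks of one EEG channel.  All parameters are assumed.
    import numpy as np
    from scipy.signal import butter, sosfilt

    FS = 250  # assumed sampling rate, Hz
    SOS = butter(4, [8, 13], btype="bandpass", fs=FS, output="sos")

    def alpha_power(block):
        # Mean squared amplitude of the alpha-band component of one block.
        filtered = sosfilt(SOS, block)
        return float(np.mean(filtered ** 2))

    # Toy demo: one second of fake EEG, a 10 Hz rhythm buried in noise.
    t = np.arange(FS) / FS
    rng = np.random.default_rng(1)
    eeg = 5.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0.0, 5.0, FS)
    print("alpha-band power:", alpha_power(eeg))

David Basiji
basiji@u.washington.edu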
mcarpenter@hmcvax.claremont.edu (Matt Carpenter) (02/07/91)
In article <1991Feb6.183330.8154@agate.berkeley.edu>, doug@eris.berkeley.edu (Doug Merritt) writes...
>Has anyone been tracking recent research into EEG (or SQUID) analysis
>that might be used as an effective computer input interface someday?

Well, about a year ago I remember reading an article which I believe talked about using EEG to track eye motion and hooking this up as input for a computer. If I remember correctly, a handicapped individual was successfully using such a system to operate a computer.

I think this article appeared in an issue of PC/Computing towards the end of 1989 or the beginning of '90, and I also believe the same issue contained an article on VR. (I just looked at a list of VR articles I got off this newsgroup, and there was an article on VR in the Nov. '89 PC/Computing, so check that issue first.)

I can't really remember much more about the article, but hopefully the above info will help some.

Matt
-------------------------------------------------------------------------------
Matt Carpenter
mcarpenter@hmcvax.bitnet  or  mcarpenter@hmcvax.claremont.edu
rdees@umiami.ir.miami.edu (Matthion) (02/09/91)
In article <1991Feb6.183330.8154@agate.berkeley.edu>, doug@eris.berkeley.edu (Doug Merritt) writes:
>
> ...
>
> The Feb. Scientific American article by Freeman about strange attractors,
> and unique EEG amplitude maps in response to recognition of sensory
> perceptions, makes me wonder if it might now be possible to sense a
> broad enough array of natural data from the brain to use as an effective
> input device. For example, if we could deduce from an EEG of the motor
> cortex that the user was visualizing a clenched fist, that could be
> used as a "grasp object" command, without even needing a data glove.
>
> ...
>
> Anyone know about this? Or know of a summary/survey source on evoked
> potentials and EEG analysis that sums up recent research?
>
> A similarly interesting possibility is that of using higher-temperature
> superconductors to make SQUIDs that could monitor magnetic fields
> in the *interior* of the brain, potentially yielding even more information.
> (EEGs pick up only surface currents, limiting the portions of the
> brain that might be analyzed even in principle.) Any new info there?
> Doug

First, I should point out that several people (whose opinions I deeply respect) have come to the conclusion that such direct neural interfaces are impossible due to the intrinsic variability between people. Although the systems are chaotic in the sense that as you go higher patterns emerge, attacks at the root level are still attacks on a random system. The question phrased differently is: "will we ever be able to detect enough useful information in the high-order signal to effectively get closer to a person's thoughts?" This is a very new area of inquiry; we will most likely have a bit of a wait...

As for looking deeper in the brain, do we want to? As you descend into the brain, you move backwards in time, through all the mammals, then reptiles, then back to species that don't even exist anymore. Interfacing at these levels will probably be closer to the low-end interfacing I mentioned above, although as we probe, we will undoubtedly uncover patterns of processing activity there as well.

So...does anyone out there have some up-to-date data on evoked potentials?
--
===========================================================================
(__)                                 | Matthew Augustus Douglas Turner
(oo)                                 |
/-------\/                           | Department of Something or Other
/ |    ||                            | College of Arcane Arts (A.A.)
*  ||----||                          | The University of Miami
====^^====^^====                     |
^^^^^^^^^^^^^/  ^^^^                 | rdees@umiami.ir.miami.edu
^^^^^^^^^^^^^^^^^^^^^^^^^^           | ...and elsewhere
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^  | Cow Hanging Eight at Malibu
                                     | Analytical Engines Incorporated
============================================================================
doug@eris.berkeley.edu (Doug Merritt) (02/13/91)
In article <1991Feb8.164913.7787@umiami.ir.miami.edu> rdees@umiami.ir.miami.edu (Matthion) writes:
>
>First, I should point out that several people (whose opinions I deeply respect)
>have come to the conclusion that such direct neural interfaces are
>impossible due to the intrinsic variability between people.

First, how about getting those people to give us some good bibliographic references so that we could understand the nature of that problem?

Second, I assume that some such intrinsic variability exists (it would be an incredible stroke of luck were it otherwise), but that's ok...I'd be quite happy to have a direct neural interface that required (say) a month of training to begin using. Or even a year, if that was the best we could do. This is not at all a reason to give up on it. Note also that interfaces that are unique to a particular individual might have a side benefit of giving inherent casual security to that unit.

>Although the systems are chaotic in the sense that as you go higher patterns
>emerge, attacks at the root level are still attacks on a random system.

I'm not sure what "higher" and "root" refer to here; can you clarify?

>As for looking deeper in the brain, do we want to? As you descend into the
>brain, you move backwards in time, through all the mammals, then reptiles, then

Yes, we probably would want to. For instance, if you wanted to get a computer image of exactly what the subject was looking at, it's quite possible that this information is available only in deeper, older parts of the brain. (Unless it just happens to be available on the surface part of the visual cortex.) If you wanted to track the precise emotional state (mood) of the subject at all times, this almost certainly would require monitoring the deeper parts of the brain.
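For what it's worth, here is a minimal sketch of the kind of per-user training I have in mind: record labeled feature vectors (say, band powers per channel) during a calibration session, then classify new samples by nearest class centroid. The features, command labels, and numbers are all invented for illustration.

    # Illustrative sketch only: per-user calibration via nearest-centroid
    # classification.  Features and labels are invented.
    import numpy as np

    def train_user_model(features, labels):
        # features: (n_samples, n_features) calibration data for ONE user;
        # labels: (n_samples,) integer command codes.  Returns the mean
        # feature vector (centroid) of each command class for that user.
        classes = np.unique(labels)
        centroids = np.stack([features[labels == c].mean(axis=0)
                              for c in classes])
        return classes, centroids

    def classify(sample, classes, centroids):
        # Pick the command whose centroid is nearest to the new sample.
        dists = np.linalg.norm(centroids - sample, axis=1)
        return classes[np.argmin(dists)]

    # Toy demo: two made-up commands (0 = "select", 1 = "grasp"),
    # four features per sample.
    rng = np.random.default_rng(2)
    calib = np.vstack([rng.normal(0.0, 1.0, (30, 4)),
                       rng.normal(3.0, 1.0, (30, 4))])
    cmds = np.array([0] * 30 + [1] * 30)
    classes, centroids = train_user_model(calib, cmds)
    print(classify(rng.normal(3.0, 1.0, 4), classes, centroids))  # likely 1
--
Doug Merritt		doug@eris.berkeley.edu (ucbvax!eris!doug)
			or uunet.uu.net!crossck!dougm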
matth@mars.njit.edu (Matthew Harelick) (02/14/91)
Wouldn't the development of neural interfaces be dangerous? If you develop equipment that can "read minds", you could develop equipment that can read specific parts of a person's brain, thereby invading his privacy, etc.

- Matt
wjbaird@dahlia.uwaterloo.ca (Warren Baird) (02/15/91)
In article <16410@milton.u.washington.edu> matth@mars.njit.edu (Matthew Harelick) writes:
>
> Wouldn't the development of neural interfaces be dangerous? If you develop
>equipment that can "read minds", you could develop equipment that can read
>specific parts of a person's brain, thereby invading his privacy, etc.

I don't think this is what is meant when most people refer to 'neural interfaces'. I'm no expert, but when I think of neural interfaces, I think of a machine that can tell when I'm trying to raise my arm, or flex my hand, and can then overwrite neural signals to make it appear (to me) that I am picking up something heavy. Or a machine that can project visual stimuli directly onto either the visual cortex or (preferably) the optic nerves.

I think that there is a BIG gap between doing that and reading someone's thoughts...

I do agree that a machine that could actively read the thoughts of a person would be a very dangerous tool...

Warren
--
Warren Baird, 2A Co-op Math Computer Science, U(Waterloo)
wjbaird@dahlia.uwaterloo.ca   ...utzoo!watmath.uwaterloo.edu!dahlia!wjbaird
An elephant is a mouse with an operating system.
matth@mars.njit.edu (Matthew Harelick) (02/15/91)
Hello Warren,

In order for a machine to overwrite neural signals and create a mental illusion for the user, it is necessary to read the state of the neural signals at the time. In addition, it will be necessary to know how to interpret the state of the neural signals.

Also, overwriting neural signals in someone's mind could lead to technology that allows computer programmers to plant suggestions in a user's mind by overriding specific neural signals. This line of research, if further abused, could lead to such things as mind control...

- Matt
minsky@media-lab.MEDIA.MIT.EDU (Marvin Minsky) (02/16/91)
In article <16473@milton.u.washington.edu> wjbaird@dahlia.uwaterloo.ca (Warren Baird) writes:
>I do agree that a machine that could actively read the thoughts of a
>person would be a very dangerous tool...

A brick is a very dangerous tool. What isn't?

But a mind-reading machine would be a path to straightening out minds. People could read their own minds, find the bad stuff in them, and fix them (using the utility programs developed for that purpose). Mind-reading very likely leads to downloading and immortality as well. I presume that it has been pointed out that any prejudice that stands in the way of indefinite life-extension is more dangerous than anything else.

So please be careful not to keep thinking such unspeakably dangerous thoughts ;-)
cyberoid@milton.u.washington.edu (Robert Jacobson) (02/17/91)
I don't know about "mind-reading" per se, but more than one psychologist has commented on the value that virtual-world art therapy could have for therapists. One could, in a sense, "enter" the intellect of another by traversing the virtual world that the patient/client creates. It might take a strong heart to tolerate what one might see -- art therapy, even in its conventional painting and sculptural forms, often produces harsh and threatening results. But the insights (a fine word, here) could be remarkable. And, being interactive, an art therapy world might permit new types of in-world therapeutic regimes.

Bob Jacobson
HIT Lab
Seattle
mg@godzilla.cgl.rmit.OZ.AU (Mike Gigante) (03/06/91)
cyberoid@milton.u.washington.edu (Robert Jacobson) writes:

>I don't know about "mind-reading" per se, but more than one psychologist
>has commented on the value that virtual-world art therapy could have for
>therapists. One could, in a sense, "enter" the intellect of another
>by traversing the virtual world that the patient/client creates. It might
>take a strong heart to tolerate what one might see -- art therapy, even in
>its conventional painting and sculptural forms, often produces harsh and
>threatening results. But the insights (a fine word, here) could be
>remarkable. And, being interactive, an art therapy world might permit
>new types of in-world therapeutic regimes.

Well, as is often the case, fiction (especially science fiction) foresees ideas before the technology is able to sustain them. What follows should be mandatory reading for any student of VR. (I would rank it well above the cyberspace genre on the reading list.)

I cannot tell you the exact name/title because I don't have my books with me at the moment, but there is a Nebula award winning story called "Shaper" (or a title containing the word shaper) by, I think, Roger Zelazny, about a therapist who shapes a virtual world in which sight, smell, taste and emotion are used as part of the therapy process. `He who shapes' actively controls the complete environment in a sort of mind meld. Control is via some kind of ultra-complex "keyboard"/"deck" where all the stimulations are controllable by sliders/knobs.

The story, which is a good one, centres on providing hitherto unseen (visual) responses to an extremely strong-willed woman who was blind from birth. The patient sits (naked) in a womb-like egg capsule. I can't recall whether or not there were direct brain taps. The opening of the story shows the ideal therapy using such a tool. What happens with our strong-willed heroine, I'll leave for your reading enjoyment.

You can find the story in a TOR collection of Nebula award winning stories. If no-one else can fill in the gaps in my memory I'll post the details next week. [The book is at a beach house I am heading to this weekend.]

Mike Gigante
ACGC, Royal Melbourne Institute of Technology
Melbourne, Australia