[comp.sys.next] non-muscular interface

pts@watt.acc.Virginia.EDU (Paul T. Shannon) (12/31/88)

Eric Raymond writes: 
-- Better we should be working on neural-interface devices, not for the sensory
-- side (that's a very very hard problem) but for the motor-affector side (which
-- is a relatively easy one). Screw keyboards; it's already known that you can
-- quickly biofeedback-train people to spark hair-thin electrodes attached to
-- individual muscle fibers in the balls of their thumbs. This is the interface
-- technology we should be investigating, looking for a non-intrusive version.

I'm working on a program for DOS PCs which is used by severely handicapped,
non-vocal students.  The aim is to allow some synthesized or sampled speech,
some recreation, and (perhaps) some environmental control for the students.
Right now the user employs a rather crude cheek switch to talk to the program;
but after I watch a half-hour of this, I always end up wishing that we could
find an easier switch.  Eric's mention of biofeedback and electrodes is
interesting, though in the case of cerebral palsy, it's unlikely that fine
muscle control is possible.  But does anyone have any ideas for non-muscular
control of a computer? At this early stage, we are only trying to read a
single switch, to indicate yes or no.
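For what it's worth, one common way to get yes/no out of a single switch is to
distinguish a short tap from a longer hold. The original post doesn't say how
the cheek switch is read, so the following is only an illustrative sketch (in
modern Python, with the switch's sample stream simulated as a list of booleans;
on real hardware the samples would come from polling a digital input line):

```python
def classify_press(samples, hold_threshold=5):
    """Classify a single-switch press from a stream of debounce samples.

    samples: sequence of booleans, True = switch closed at that poll tick.
    hold_threshold: hypothetical tuning parameter -- a run of at least this
    many consecutive closed samples counts as a hold ("yes"); any shorter
    press counts as a tap ("no"); no closed samples at all returns None.
    """
    run = 0      # length of the current run of closed samples
    longest = 0  # longest run seen so far
    for closed in samples:
        if closed:
            run += 1
            longest = max(longest, run)
        else:
            run = 0
    if longest == 0:
        return None  # switch never closed: no answer yet
    return "yes" if longest >= hold_threshold else "no"
```

For example, a brief two-tick tap (`[False, True, True, False]`) reads as "no",
while six consecutive closed ticks read as "yes". Counting the longest run
rather than reacting to single samples also gives some protection against
contact bounce on a crude mechanical switch.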

I realize that this request for information is well off the topic of the
next computer.  My apologies.

Paul Shannon
University of Virginia
pts@virginia (bitnet)