[comp.human-factors] Eye Movement Trackers

mgreen@cs.toronto.edu (Marc Green) (06/15/91)

>From: asylvain@felix.UUCP (Alvin "the Chipmunk" Sylvain)

>How about tossing that old mouse into the trash can, and constructing
>a device which looks into your eyes while you're reading the screen,
>and can actually track the precise location of what you're looking at?

>Rather than "clicking" a mouse, you just touch a button when you want
>to select the word or screen-button that's in your current "gaze."
>Naturally, a cursor follows your gaze around the screen, and touching
>the button flashes the cursor (or inverts the screen button, whatever)
>for immediate feedback.

>Such devices already exist, but they require physical contact with the
>eyeball.  This is undesirable (at least to me!)

>This may sound rather like science-fiction, but it's probably possible.

>Don't forget, if anybody _does_ invent such a thing, you heard it here
>first!

You obviously don't know much about the world of eye trackers. There
are numerous eye trackers which do not make contact with the eye. The
king of them all is the SRI double Purkinje tracker, which detects
changes in the position of the Purkinje images in the eye.
(Purkinje images are created by the different refractive indices of
the various ocular media.) All you need is about $60K and you have one
of your very own. There are also cheaper methods: glasses that bounce
infrared beams off the eye, EOGs which measure activity in the
oculomotor muscles, etc. These have problems with accuracy and
reliability, and many operate only in the horizontal plane.

All trackers require careful and frequent calibration.  Further, it is
not easy to tell where a person is looking, even if you know the
position of the pupil; there is a big difference between knowing the
position of the eye and the locus of gaze. People also make many
involuntary eye movements, so a tracker would unintentionally cause
actions to occur. I don't think that eye trackers will ever become
popular. Just too many problems.
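
The eye/gaze distinction is easy to make concrete: the point of regard
depends on where the head sits, not just on how the eye is rotated.  A
toy sketch (the function and geometry here are illustrative, not any
real tracker's model):

```python
def gaze_point_on_screen(eye_pos, gaze_dir, screen_z=0.0):
    """Intersect a line-of-sight ray with the screen plane z = screen_z.

    eye_pos  -- (x, y, z) position of the eye's optical center
    gaze_dir -- (dx, dy, dz) direction of the line of sight (dz != 0)
    Returns the (x, y) point of regard on the screen plane.
    """
    ex, ey, ez = eye_pos
    dx, dy, dz = gaze_dir
    t = (screen_z - ez) / dz          # ray parameter at the screen plane
    return (ex + t * dx, ey + t * dy)

# Identical eye rotation, head shifted 10 cm to the right: the point of
# regard moves too, so eye position alone does not give you the gaze.
print(gaze_point_on_screen((0.0, 0.0, 0.6), (0.0, 0.0, -1.0)))  # (0.0, 0.0)
print(gaze_point_on_screen((0.1, 0.0, 0.6), (0.0, 0.0, -1.0)))  # (0.1, 0.0)
```

This is also why calibration is so fussy: the mapping from measured eye
position to screen coordinates shifts every time the head does.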

Marc Green
Trent University

aipdc@castle.ed.ac.uk (Paul Crowley) (06/15/91)

In article <91Jun14.160659edt.6227@neat.cs.toronto.edu> mgreen@cs.toronto.edu (Marc Green) writes:
>All trackers require careful and frequent calibration.  Further, it is
>not easy to tell where a person is looking, even if you know the
>position of the pupil; there is a big difference between knowing the
>position of the eye and the locus of gaze. People also make many
>involuntary eye movements, so a tracker would unintentionally cause
>actions to occur. I don't think that eye trackers will ever become
>popular. Just too many problems.

I've come up with an idea for an eye tracker that's probably impractical
but appeals to me because if it worked it would be simple and cheap and
work better than the other options.  It goes something like this: 
attach electrodes to the back of the brain (where the vision processing
goes on) and momentarily disturb one half of the screen, then the other
half.  The response will be markedly stronger in the half in which the
user is looking (I have seen this, and it's true).  Split that half into
halves again and repeat until a small box has been marked off.

There are three serious problems with this.  First, looking at a display
that flickers over and over to find out where you're looking would be a
pain in the neck.  Second, it might be hard to measure the response,
although it's so obvious to look at that it might not be too hard. 
Third, the halving process might take too long.  It can be made slightly
faster by using thirds instead of halves, but even then fourteen tests
are needed, meaning 42 disturbances to narrow the gaze down to one pixel
in a million.  I don't know how long it would take to disturb the
screen and measure the response, so this could take anywhere between a
screen and measure the response, so this could take anywhere between a
tenth of a second and a minute.  Oh, and you have to attach electrodes
to the back of your neck, although this is not as much hassle as you
might think - just don't do it if you need to look your best... 

I think that eye trackers have enormous potential if combined with
rudimentary speech recognition.  Speech recognition software exists that
can accept simple commands with close on 100% reliability, and we can
pick the right word faster than we can pick the right mouse button, and
with a greater range.  If you have an Athena-Widget style "highlight
when the eye is in the box" then people will usually do what they meant
to, and you have a natural and _very_ fast user interface.
                                         ____
\/ o\ Paul Crowley aipdc@castle.ed.ac.uk \  /
/\__/ Part straight. Part gay. All queer. \/
"I say we kill him and eat his brain."
"That's not the solution to _every_ problem, you know!" -- Rudy Rucker

pilgrim@daimi.aau.dk (Jakob Gaardsted) (06/16/91)

mgreen@cs.toronto.edu (Marc Green) writes:
>king of them all is the SRI double Purkinje tracker, which detects
>changes in the position of the Purkinje images in the eye.
>(Purkinje images are created by the different refractive indices of
>the various ocular media.) All you need is about $60K and you have one
>of your very own. There are also cheaper methods: glasses that bounce
>infrared beams off the eye, EOGs which measure activity in the
>oculomotor muscles, etc. These have problems with accuracy and
>reliability, and many operate only in the horizontal plane.

>All trackers require careful and frequent calibration.  Further, it is
>not easy to tell where a person is looking, even if you know the
>position of the pupil; there is a big difference between knowing the
>position of the eye and the locus of gaze. People also make many

How do they cope with the relative position of the head and the screen?
Am I to hold my head absolutely still?

>involuntary eye movements, so a tracker would unintentionally cause

Consider the aforementioned cursor. Are you saying that just as I am
about to press the activate button, my eyes may flicker? (i.e. can one
have trouble "fixing" one's looking direction?)

>actions to occur. I don't think that eye trackers will ever become
>popular. Just too many problems.

I hope they will become popular, and I expect our technology to advance
so as to solve this. It is just too great an interface to miss...

>Marc Green
>Trent University
--
From the notorious
                      Jakob Gaardsted, Computer Science Department
Bed og arbejd !            University of Aarhus,  Jylland (!)
(Pray and work!)  AMIGA!  pilgrim@daimi.aau.dk | I'd rather play Moria.

rpotter@grip.cis.upenn.edu (Robert Potter) (06/17/91)

In article <91Jun14.160659edt.6227@neat.cs.toronto.edu>, mgreen@cs.toronto.edu (Marc Green) writes:
>... there is a big difference between knowing the position of the eye and the
>locus of gaze.

Huh?  How can the locus of gaze move independently of the eyeballs?

-- 
Robert Potter                               rpotter@grip.cis.upenn.edu
GRASP laboratory, Univ. of Pennsylvania

bmacinty@mud.uwaterloo.ca (Blair MacIntyre) (06/17/91)

>>>>> rpotter@grip.cis.upenn.edu (Robert Potter) wrote:

Robert> In article <91Jun14.160659edt.6227@neat.cs.toronto.edu>,
Robert> mgreen@cs.toronto.edu (Marc Green) writes:
>... there is a big difference between knowing the position of the eye
>and the locus of gaze.

Robert> Huh?  How can the locus of gaze move independently of the eyeballs?

Perhaps he's thinking of the position (xyz) of the eyeballs vs. the
position and orientation (xyzpyr) (that's pitch/yaw/roll) of the
eyeballs.

No, I can't picture the locus of gaze moving independently of the
eyeballs, but I can think of when it may come in handy!
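
To spell out xyzpyr: the xyz gives the origin of the line of sight, and
pitch/yaw give its direction; roll spins the eye about that line without
moving the point of regard at all.  A hypothetical helper (the
coordinate convention is assumed, not from any cited system):

```python
import math

def gaze_direction(pitch, yaw):
    """Unit line-of-sight vector from pitch and yaw in radians.

    Assumed convention: +y up, -z into the screen; pitch tilts the gaze
    up, yaw turns it right.  Roll is deliberately absent, since rolling
    the eye does not move the gaze line.
    """
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            -math.cos(pitch) * math.cos(yaw))

print(gaze_direction(0.0, 0.0))  # (0.0, 0.0, -1.0): straight ahead
```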
--
Blair MacIntyre, Computer Graphics Lab
Dept. of Computer Science, University of Waterloo, Waterloo, ON, Canada, N2L3G1
{bmacintyre@{watcgl|violet|watdragon}}.{waterloo.edu|uwaterloo.ca}