[comp.sys.mac.programmer] Timer

ps299ap@sdcc6.ucsd.edu (Ethan Montag) (10/16/90)

I am writing a program in THINK Pascal in which I would like
to time the user's response.  The tick stuff in Pascal is too coarse.
I would like msec or better resolution, and a tick is only 1/60th of
a second.  The absolute units of time aren't critical either.
I am trying to time the interval from presentation of a picture
on the screen until the user presses the mouse button.  I already
have the button press part, I just need a timer.

If anybody has some code to do this I would appreciate it.  Please
e-mail me and name your price :^$.

I actually want to time the interval from the presentation of a picture
to the voice response of the user.  I am thinking of attaching a
voice-activated relay to the mouse button input, which is just a grounding
of two pins.  If anybody knows of a better way to do this, please let me
know.
Is there a better port for this?  Is there an inexpensive input card that
would do this better?


Ethan Montag
emontag@ucsd.edu.BITNET

tarr-michael@cs.yale.edu (michael tarr) (10/16/90)

Ugh, another psych person trying to get millisecond-accurate timing. Two
things -- for such a timer, check last year's volume of Behavior Research
Methods, Instruments, & Computers (the blue Psychonomics journal) for an
article with Doug Chute as one of the authors on the Drexel MilliTimer. I
can post the reference if people are interested. Second, you are not
really going to get millisecond accuracy anyway using the ADB -- maybe you
could with the serial port, but I am skeptical -- better to buy an I/O
board with on-board counter/timers. Or use ticks -- see a paper called
"Good News for Bad Clocks" that came out a few years ago. Again, I can
post the reference if someone wants it. By the way, I still don't have a
way of getting accurate ticks or screen refreshes on Mac II monitors. I
have been sent several pieces of code in the past, but they don't seem to
work.

-- 
 * Mike Tarr                                    The Human Neuron Project  *
 * tarr@cs.yale.edu                             Department of Psychology  *
 * "My opinions are always my own."             Yale University           *
 **************************************************************************

ts@cup.portal.com (Tim W Smith) (10/18/90)

If you don't care about the absolute units of time, how about just
using a loop:

	long count;	/* count just spins up until the mouse button goes down */

	for ( count = 0; ! Button(); count++ )
		;

scott@scotty (Scott Howard) (10/18/90)

ts@cup.portal.com (Tim W Smith) writes:

>If you don't care about the absolute units of time, how about just
>using a loop:

>	for ( count = 0; ! Button(); count++ )

BAD IDEA!
Why not simply wait 'x' number of TickCounts?
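
For reference, the tick-based version of that loop would look something
like this (a sketch, assuming THINK C and the standard Toolbox TickCount()
and Button() calls); note that it only resolves to 1/60th of a second:

	long	startTicks, elapsedTicks;

	startTicks = TickCount();		/* ticks (1/60 s) since startup */
	while (!Button())
		;				/* spin until the mouse button goes down */
	elapsedTicks = TickCount() - startTicks;	/* reaction time in ticks */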

tarr-michael@cs.yale.edu (michael tarr) (10/18/90)

Here for all that are interested are the two references I mentioned:


Westall, R. F., Perkey, M. N., & Chute, D. L. (1989).  Millisecond
	timing on the Apple Macintosh: updating Drexel's millitimer.  Behavior
	Research Methods, Instruments, & Computers, 21 (5), 540-547.

Ulrich, R., & Giray, M. (1989).  Time resolution of clocks: effects on
	reaction time measurement -- good news for bad clocks.  British Journal
	of Mathematical and Statistical Psychology, 42, 1-12.

Note that if you are using an ADB input device it is unlikely you are
getting millisecond precision anyway. For my own software I have two
strategies for measuring reaction times:

1. For ADB input devices: run without MultiFinder and use an event mask
that only looks for events from the device subjects are responding on. Use
the Event.when field to find out, in ticks, when the subject responded
(see the sketch after item 2 below). Note that there are 60.15 ticks per
second (see IM V). This number is fixed and is no longer tied to the
screen refresh rate. If you need the video sync to draw flicker-free
images, you need to use a VBL task which is specific to your monitor (I
still don't have this part working -- help please!).

2. To get millisecond accuracy: buy a digital I/O card, like the one
available from Strawberry Tree, with counter/timers that can be latched to
the I/O channels -- this way you get REAL millisecond precision, since the
timer stops when the channel goes high (e.g., when your subject presses a
key connected to the channel).
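
Here is a sketch of approach (1), assuming THINK C with the standard
Toolbox includes; the when field of the EventRecord holds the tick count
at the moment the event was posted:

	EventRecord	theEvent;
	long		stimOnset, rtTicks;

	stimOnset = TickCount();		/* stimulus has just been drawn */
	while (!GetNextEvent(mDownMask, &theEvent))
		;				/* ignore everything but mouse-downs */
	rtTicks = theEvent.when - stimOnset;	/* reaction time in ticks (60.15/s) */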

Many psychologists are too hyper about millisecond precision -- given the
distributions of most RTs, this is unnecessary. Plus, few if any are
actually coaxing msec precision out of their computers -- but then they
report it as msecs because the reviewers are saying "I don't trust your
data unless it is msec precise." Remember, millisecond precision is just
as arbitrary as p < .05!!! It is doubtful that brain function has much to
do with msecs. (Although it might be 50 Hz ticks if you believe Francis
Crick and Christof Koch!!!).
-- 
 * Mike Tarr                                    The Human Neuron Project  *
 * tarr@cs.yale.edu                             Department of Psychology  *
 * "My opinions are always my own."             Yale University           *
 **************************************************************************

francis@arthur.uchicago.edu (Francis Stracke) (10/19/90)

In article <1990Oct18.032201.27217@ux1.cso.uiuc.edu> scott@scotty (Scott Howard) writes:
>ts@cup.portal.com (Tim W Smith) writes:
>
>>If you don't care about the absolute units of time, how about just
>>using a loop:
>
>>	for ( count = 0; ! Button(); count++ )
>
>BAD IDEA!
>Why not simply wait 'x' number of TickCounts?
>

I believe the original poster said that the whole problem was that ticks
were too coarse a measure.

| Francis Stracke		| My opinions are my own.  I don't steal them.|
| Department of Mathematics	|=============================================|
| University of Chicago		| A mathematician is a professional	      |
| francis@zaphod.uchicago.edu	|   schizophrenic.--Me.		       	      |

mbabramowicz@amherst.bitnet (10/19/90)

I think that it is possible to get better-than-tick accuracy without
installing any new hardware or relying on a particular Macintosh model.

(Note: I haven't read the articles referenced previously, so perhaps someone
has already written about this.)

The for loop that someone suggested earlier is actually useful, with
modifications:

	Time = TickCount();
	for (count = 0; (count <= 2000) && (!Button()); count++)
		;
	Time = TickCount() - Time;
	oneTick = 2000.0 / Time;	/* loop iterations per tick */

The variable oneTick is now set equal to the number of times you go through
the for loop in a Tick.

Then, if you want to track user reaction time, you use the same for loop
as before. (To preserve the exactness of your count, be sure to keep the
count <= 2000 test in the loop, since that comparison takes some time.
You can change it to count <= something much bigger if you want the user
to be able to take longer than 2000 iterations.)

When the user hits the mouse button, the loop exits.
count / oneTick now gives the time the loop took, in ticks; dividing that
by 60 gives seconds, and multiplying the seconds by 1000 gives
milliseconds.
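
Putting that together, the measurement phase might look like the following
(a sketch only; oneTick is the float or double computed by the calibration
loop above, and the iteration limit is raised so slow responses are not
cut off):

	long	count;
	double	msec;

	for (count = 0; (count <= 100000L) && (!Button()); count++)
		;
	/* count / oneTick = elapsed ticks; one tick is 1/60 s, so scale to msec */
	msec = (count / oneTick) * (1000.0 / 60.0);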

Note of course that for accuracy oneTick should be a float or a double.

If you use a number higher than 2000 you will obtain greater accuracy.

This will work on any Macintosh, unless they make a Macintosh so fast that
it can do 2000 iterations in less than a tick. (Maybe some already can; I
don't really know. If so, just use 100,000 or 1,000,000.)

One little annoying thing about this method: what happens if the user
presses the mouse button while you are running the calibration loop? I
haven't worked that out yet, but I doubt the workaround would be complex.
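
One simple guard (untested, just a sketch built on the code above): wait
for the button to be released before calibrating, and redo the calibration
if a press cuts it short:

	do {
		while (Button())
			;			/* wait for the button to be released */
		Time = TickCount();
		for (count = 0; (count <= 2000) && (!Button()); count++)
			;
		Time = TickCount() - Time;
	} while (count <= 2000);	/* a press ended the loop early -- try again */
	oneTick = 2000.0 / Time;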

I haven't actually tried this out; this is all speculation; all disclaimers
apply.

Michael Abramowicz
Amherst College

isr@rodan.acs.syr.edu (Michael S. Schechter - ISR group account) (10/22/90)

In article <10719.271f09c1@amherst.bitnet> mbabramowicz@amherst.bitnet writes:
>	Time = TickCount();
>	for (count = 0; (count <= 2000) && (!Button()); count++)
>		;
>	Time = TickCount() - Time;
>	oneTick = 2000.0 / Time;	/* loop iterations per tick */
>
>The variable oneTick is now set equal to the number of times you go through
>the for loop in a Tick.
This loop takes a variable amount of time to execute because of interrupts
occurring during it. And, yes, these interrupts take up **lots** of time
at times. Even if a machine is sitting idle, you still have some
interrupts in there taking up an unknown amount of time (on a II, there's
something that takes 80-120 usec and happens at least a few times every
second), and that's without even worrying about what happens when the
mouse moves (more interrupts).

-- 
Mike Schechter, Computer Engineer,Institute Sensory Research, Syracuse Univ.
InterNet: Mike_Schechter@isr.syr.edu isr@rodan.syr.edu Bitnet: SENSORY@SUNRISE