[comp.sys.ibm.pc] Turbo Pascal 5.0 and GetTime

chen-holly@CS.Yale.EDU (Holly Chen) (12/16/88)

I'm running a psychology experiment in which I time people's responses.  I
used SetTime and GetTime in Turbo Pascal 5.0 to time the subjects.  Although
the manual suggests that the timing is good to the hundredth of a second, it
appears that the timing is not this accurate.  The system does not appear to
increment the hundredths of a second at even intervals.  Can anybody explain
to me why the system clock seems to time inaccurately?  Does anybody know a
good way to time people with this kind of accuracy?

						- Holly

spolsky-joel@CS.YALE.EDU (Joel Spolsky) (12/16/88)

In article <45782@yale-celray.yale.UUCP> chen-holly@CS.Yale.EDU (Holly Chen) writes:
>I'm running a psychology experiment in which I time people's responses.  I
>used SetTime and GetTime in Turbo Pascal 5.0 to time the subjects.  Although
>the manual suggests that the timing is good to the hundredth of a second, it
>appears that the timing is not this accurate.  The system does not appear to
>increment the hundredths of a second at even intervals.  Can anybody explain
>to me why the system clock seems to time inaccurately?  Does anybody know a
>good way to time people with this kind of accuracy?

Holly,

The PC's clock is only updated 18 times per second, so the "time" that
the system gives you is only accurate to +/- 55 milliseconds. 
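
You can actually watch this happen.  A minimal sketch (untested, Turbo
Pascal 5.0, Dos unit) that polls GetTime for one full second and counts
how many distinct hundredth-of-a-second values show up; you should see
18 or 19 of them, not 100:

program TickDemo;
uses Dos;
var
  H, M, S, S100  : Word;
  StartSec, Last : Word;
  Changes        : Integer;
begin
  { wait for the seconds field to roll over, so we measure a full second }
  GetTime(H, M, S, S100);
  StartSec := S;
  repeat
    GetTime(H, M, S, S100);
  until S <> StartSec;
  StartSec := S;
  Last     := S100;
  Changes  := 0;
  { poll as fast as possible for one second, counting distinct values }
  repeat
    GetTime(H, M, S, S100);
    if S100 <> Last then
    begin
      Inc(Changes);
      Last := S100;
    end;
  until S <> StartSec;
  WriteLn('Distinct hundredth values in one second: ', Changes);
end.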

There are two ways to get finer resolution:

1. Buy a timer card ("official" solution)
2. Write convoluted code that relies on the fact that a certain loop
takes x milliseconds to execute, then trick the compiler, somehow,
into not optimizing the loop away (awful "hacker's" solution that is
guaranteed to break, severely, when you change compilers, DOS
versions, hardware, or heck, probably even the contrast on the monitor).

+----------------+----------------------------------------------------------+
|  Joel Spolsky  | bitnet: spolsky@yalecs.bitnet     uucp: ...!yale!spolsky |
|                | internet: spolsky@cs.yale.edu     voicenet: 203-436-1483 |
+----------------+----------------------------------------------------------+
                                                      #include <disclaimer.h>

Ralf.Brown@B.GP.CS.CMU.EDU (12/16/88)

In article <45794@yale-celray.yale.UUCP>, spolsky-joel@CS.YALE.EDU (Joel Spolsky) writes:
}Holly,
}
}The PC's clock is only updated 18 times per second, so the "time" that
}the system gives you is only accurate to +/- 55 milliseconds. 
}
}There are two ways to get finer resolution:
[...]

3.  On an AT, enable the real-time clock's 1024/sec interrupt to get
approximately millisecond resolution.
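
Roughly, the recipe (an untested sketch in Turbo Pascal, using the
standard AT port addresses and assuming the power-up periodic rate of
1024 Hz) is: hook INT 70h, set the periodic-interrupt enable bit in RTC
status register B, and unmask IRQ 8.  The handler has to read status
register C each time or you only ever get one interrupt:

uses Dos;

var
  RTCTicks : LongInt;          { incremented 1024 times per second }
  OldInt70 : Pointer;

procedure RTCHandler; interrupt;
var
  Dummy : Byte;
begin
  Inc(RTCTicks);
  Port[$70] := $0C;                  { select RTC status register C }
  Dummy := Port[$71];                { reading it re-arms the interrupt }
  Port[$A0] := $20;                  { EOI to the slave 8259 }
  Port[$20] := $20;                  { EOI to the master 8259 }
end;

procedure StartRTC;
var
  B : Byte;
begin
  RTCTicks := 0;
  GetIntVec($70, OldInt70);
  SetIntVec($70, @RTCHandler);
  { real code should disable interrupts around these paired accesses }
  Port[$70] := $0B;                  { RTC status register B }
  B := Port[$71];
  Port[$70] := $0B;
  Port[$71] := B or $40;             { enable the periodic interrupt }
  Port[$A1] := Port[$A1] and $FE;    { unmask IRQ 8 on the slave 8259 }
end;

procedure StopRTC;
var
  B : Byte;
begin
  Port[$70] := $0B;
  B := Port[$71];
  Port[$70] := $0B;
  Port[$71] := B and $BF;            { disable the periodic interrupt }
  SetIntVec($70, OldInt70);
end;

It does steal a little CPU time, since the handler runs 1024 times a
second, but on an AT the overhead is small.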


--
UUCP: {ucbvax,harvard}!cs.cmu.edu!ralf -=-=-=- Voice: (412) 268-3053 (school)
ARPA: ralf@cs.cmu.edu  BIT: ralf%cs.cmu.edu@CMUCCVMA  FIDO: Ralf Brown 1:129/31
			Disclaimer? I claimed something?
	You cannot achieve the impossible without attempting the absurd.

dave@elandes.UUCP (D. Mathis) (12/16/88)

In article <45794@yale-celray.yale.UUCP>, spolsky-joel@CS.YALE.EDU (Joel Spolsky) writes:
> In article <45782@yale-celray.yale.UUCP> chen-holly@CS.Yale.EDU (Holly Chen) writes:
[ text deleted ]
> >the manual suggests that the timing is good to the hundredth of a second, it
> >appears that the timing is not this accurate.  The system does not appear to
> 
> Holly,
> 
> There are two ways to get finer resolution:

	3. Read the countdown register in the timer chip.  This gives ~1
	microsecond resolution, but the register is only 16 bits wide, so
	it wraps every .055 sec.
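
For reference, latching and reading that register looks something like
this in Turbo Pascal (an untested sketch; the chip counts down at about
1.19 MHz, i.e. roughly 0.84 microseconds per count, and the BIOS runs
channel 0 in square-wave mode, so for really precise work you would
reprogram it to mode 2 first):

{ Latch and read the 16-bit countdown register of timer channel 0. }
function ReadTimerCount: Word;
var
  LoByte, HiByte : Word;
begin
  Port[$43] := $00;            { latch command for channel 0 }
  LoByte := Port[$40];         { low byte of the latched count }
  HiByte := Port[$40];         { high byte of the latched count }
  ReadTimerCount := (HiByte shl 8) + LoByte;
end;

var
  BiosTicks  : LongInt absolute $0040:$006C;  { 18.2 Hz BIOS tick count }
  StartTicks : LongInt;
  StartCount : Word;

{ Call MarkStart at the start of the interval, ElapsedPulses at the end.
  One pulse is about 0.8381 microseconds.  There is a small race if a
  tick rolls over between the two reads inside either routine. }
procedure MarkStart;
begin
  StartTicks := BiosTicks;
  StartCount := ReadTimerCount;
end;

function ElapsedPulses: LongInt;
begin
  ElapsedPulses := (BiosTicks - StartTicks) * 65536
                   + LongInt(StartCount) - LongInt(ReadTimerCount);
end;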

-- 
	Dave Mathis, ELAN designs           UUCP  ...oliveb!elandes!dave

nebeker@nprdc.arpa (Del Nebeker) (12/17/88)

We have been doing psychology experiments with a similar intent for about
three years, capturing keystroke counts and times while subjects work in
dBase.  We had to write an assembler program that monitors the keyboard,
stores the count of keystrokes and the elapsed time, and then delivers
those totals to another program after the timing is complete.

We found that on the XT the best resolution possible is set by the clock
tick rate.  That is, the time available to software only advances on each
tick of the timer interrupt.  On the XT this rate is 18 ticks per second,
at least as far as the keyboard is concerned, which translates into a
resolution of .0556 seconds.  Therefore, if an event takes less time than
this it cannot be timed accurately.

Even if the event is longer than this, there will still be some error, not
to exceed .0556 second.

I can't speak to any other machines, but I'm sure the logic would be the
same.
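
For anyone who just wants to see that granularity from Turbo Pascal
without writing a resident assembler monitor like ours, something like
this (untested sketch) times a single keypress against the BIOS tick
counter at 0040:006C:

program KeyTime;
uses Crt;
var
  BiosTicks : LongInt absolute $0040:$006C;   { 18.2 Hz tick counter }
  StartTick, StopTick : LongInt;
  Ch : Char;
begin
  Write('Press a key NOW: ');
  StartTick := BiosTicks;
  Ch := ReadKey;                              { wait for the response }
  StopTick := BiosTicks;
  WriteLn;
  WriteLn('Response time: ', (StopTick - StartTick) / 18.2:0:3,
          ' seconds, in steps of about .055 second');
end.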

I hope this is of some help.  If anyone else has a better explanation I'd
like to know about it too.

Del Nebeker
Code 161
Navy Personnel R&D Center
San Diego, CA 92152-6800
(619) 553-7749
nebeker@nprdc.arpa

teittinen@cc.helsinki.fi (12/20/88)

In article <23a8dd15@ralf>, Ralf.Brown@B.GP.CS.CMU.EDU writes:
> 3.  On an AT, enable the real-time clock's 1024/sec interrupt to get
> approximately millisecond resolution.

Questions:
    1. How do you do that? 
    2. Does it affect the speed of software?

-----------------------------------+-------------------------------------------
    EARN: teittinen@finuh          I "Studying is the only way to do nothing
Internet: teittinen@cc.helsinki.fi I  without anyone complaining about it."
-----------------------------------+-------------------------------------------
             Marko Teittinen, student of computer science
-------------------------------------------------------------------------------