[comp.os.msdos.programmer] How can I get a timed delay of 1 millisecond ?

elee24@castle.ed.ac.uk (H Bruce) (09/11/90)

What is the best way to get a program to wait for one millisecond?
(I am using Microsoft C V5.1).

Thanks,

Henry Bruce.

bcw@rti.rti.org (Bruce Wright) (09/12/90)

In article <6250@castle.ed.ac.uk>, elee24@castle.ed.ac.uk (H Bruce) writes: 
> What is the best way to get a program to wait for one millisecond?
> (I am using Microsoft C V5.1).

The usual way to get a timed delay of one millisecond on the
PC is to time a loop that's long enough to take a significant
amount of time even on a 486, and then scale that appropriately
to execute the loop the proper number of times for your PC.  This
can allow the delay to be pretty nearly one millisecond, without the
usual truncation problems you would have with a one millisecond
timer interrupt, which could have an error of up to one millisecond
if it's implemented as the usual multi-shot system timer interrupt.
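
In Microsoft C that might look something like the sketch below (untested;
the calibration against the 18.2 Hz BIOS tick count at 0040:006Ch, and the
spin/calibrate_loop/delay_1ms names, are just one way of doing it):

/* Sketch of a calibrated busy-wait:  time an empty loop against the
 * 18.2 Hz BIOS tick count at 0040:006Ch, then scale it down to one
 * millisecond.  "volatile" keeps the compiler from removing the loop
 * or caching the tick count in a register.
 */
static volatile long far *bios_ticks = (volatile long far *)0x0000046CL;
static unsigned long loops_per_ms;

static void spin(unsigned long n)
{
	volatile unsigned long i;

	for (i = 0; i < n; i++)
		;				/* burn time, nothing else */
}

void calibrate_loop(void)
{
	long start;
	unsigned long total = 0;

	start = *bios_ticks;
	while (*bios_ticks == start)		/* align on a tick boundary */
		;
	start = *bios_ticks;
	while (*bios_ticks - start < 18) {	/* run for ~1 second (18 ticks) */
		spin(1000UL);
		total += 1000UL;
	}
	loops_per_ms = total / 989UL;		/* 18 ticks is about 989 ms */
}

void delay_1ms(void)
{
	spin(loops_per_ms);
}

Call calibrate_loop() once at startup, then delay_1ms() wherever you need
the delay.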

The problem with this is that you generally can't do anything
useful during that loop (otherwise it might take different
amounts of time ...).  Getting around this problem requires that
you have a 1 millisecond (or smaller) timer, which makes your code more
hardware-specific than the above method.  You can either put a
third-party timer on the machine (often sold in conjunction with
A-D boards), or you can reprogram the 8253 timer if your machine
is a "true" compatible (which most are these days, but it depends
on how paranoid you need to be for this application).  The problem
with reprogramming the 8253 (besides hardware compatibility issues)
is that you will then need to steal the timer interrupt from DOS,
and if you want the DOS time to remain accurate you will need to
provide DOS with a simulated interrupt every once in a while;  
then of course you will probably want to give the interrupt back
to DOS when you're through (if you don't and exit anyway you will
crash ... though if you are running a real-time process you may
not ever want to exit).

If you want further details on reprogramming the 8253 I can dig
up the information, but I don't have anything _quite_ like what
you want lying around & it has enough hardware-specific code that 
I can't just write it off the top of my head.
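The bare skeleton is roughly like this, though (untested, E&OE;  it assumes
a true 8253 compatible and Microsoft C's _dos_getvect/_dos_setvect, outp,
and the interrupt keyword;  the handler here just counts milliseconds, and
the hardware-specific details are what I'd have to dig up):

/* Skeleton:  run 8253 channel 0 at ~1000 Hz, hook INT 8, and chain to the
 * old BIOS/DOS handler every 55th interrupt so the time of day stays
 * roughly right.  Call stop_fast_timer() before exiting, or else.
 */
#include <dos.h>
#include <conio.h>

#define PIT_CTRL  0x43
#define PIT_CH0   0x40
#define MS_COUNT  1193			/* 1193180 Hz / 1193 = ~1000 Hz */

static void (interrupt far *old_int8)();
static volatile unsigned long milliseconds = 0;
static unsigned chain = 0;

void interrupt far new_int8(void)
{
	milliseconds++;
	if (++chain >= 55) {		/* ~every 55 ms, give DOS its 18.2 Hz tick */
		chain = 0;
		(*old_int8)();		/* Turbo C pushes flags for this call; with
					   MSC check the generated code, or use
					   _chain_intr if you have C 6.0 */
	} else
		outp(0x20, 0x20);	/* otherwise acknowledge the 8259 ourselves */
}

void start_fast_timer(void)
{
	old_int8 = _dos_getvect(0x08);
	_dos_setvect(0x08, new_int8);
	outp(PIT_CTRL, 0x36);		/* channel 0, LSB/MSB, mode 3 */
	outp(PIT_CH0, MS_COUNT & 0xFF);
	outp(PIT_CH0, MS_COUNT >> 8);
}

void stop_fast_timer(void)
{
	outp(PIT_CTRL, 0x36);		/* count of 0 = 65536 = 18.2 Hz again */
	outp(PIT_CH0, 0);
	outp(PIT_CH0, 0);
	_dos_setvect(0x08, old_int8);
}

stop_fast_timer() is the "give it back" part;  the DOS time will still
drift a little because 55 fast ticks aren't exactly one slow one.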

Hope this helps a bit ...

						Bruce C. Wright

ekalenda@cup.portal.com (Edward John Kalenda) (09/12/90)

> What is the best way to get a program to wait for one millisecond?
> (I am using Microsoft C V5.1).

If you really need that fine a timer, there is a spare timer channel
in the timer chip (channel 2, which normally only drives the speaker).
Program it to start counting down at zero, then read the current counter
value until it reaches 65536 minus the number of ticks you want, where
each tick is 1/1193180 seconds.

If you need a code fragment, send me e-mail and I'll try to put one
together for you.
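
Roughly, though, it would look something like this in Microsoft C (untested,
and the port 61h gate handling is from memory - check it against your
technical reference):

/* Sketch:  use 8253 channel 2 (normally only driving the speaker) as a
 * free-running down counter and poll it for ~1193 ticks (1 ms).
 * inp/outp are in <conio.h> with Microsoft C.
 */
#include <conio.h>

void wait_1ms(void)
{
	unsigned start, now, elapsed;

	outp(0x61, (inp(0x61) & ~0x02) | 0x01);	/* gate channel 2 on, speaker data off */
	outp(0x43, 0xB0);			/* channel 2, LSB/MSB, mode 0 */
	outp(0x42, 0x00);			/* initial count 0 = 65536 */
	outp(0x42, 0x00);

	outp(0x43, 0x80);			/* latch channel 2 for a consistent read */
	start = inp(0x42);
	start |= inp(0x42) << 8;

	do {
		outp(0x43, 0x80);		/* latch again, read LSB then MSB */
		now = inp(0x42);
		now |= inp(0x42) << 8;
		elapsed = (start - now) & 0xFFFF;	/* the counter counts down */
	} while (elapsed < 1193);		/* 1193 ticks of 1/1193180 sec ~= 1 ms */
}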

Ed
ekalenda@cup.portal.com

Ralf.Brown@B.GP.CS.CMU.EDU (09/12/90)

In article <4064@rtifs1.UUCP>, bcw@rti.rti.org (Bruce Wright) wrote:
}In article <6250@castle.ed.ac.uk>, elee24@castle.ed.ac.uk (H Bruce) writes: 
}> What is the best way to get a program to wait for one millisecond?
}> (I am using Microsoft C V5.1).
}
}The usual way to get a timed delay of one millisecond on the
}PC is to time a loop that's long enough to take a significant
}amount of time even on a 486, and then scale that appropriately
}to execute the loop the proper number of times for your PC.  This

On ATs and up, you can program the real-time clock chip to generate
interrupts 1024 times per second, which will get you close to 1ms delays.
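
A bare sketch of that in Microsoft C (untested;  70h/71h are the AT CMOS/RTC
index and data ports, IRQ 8 arrives on INT 70h, and the function names are
my own):

/* Sketch:  enable the AT real-time clock's periodic interrupt at 1024 Hz
 * and count interrupts on IRQ 8 (INT 70h).  Restore everything before
 * exiting or the machine is left in a strange state.
 */
#include <dos.h>
#include <conio.h>

static void (interrupt far *old_int70)();
static volatile unsigned long rtc_ticks = 0;

void interrupt far rtc_handler(void)
{
	rtc_ticks++;
	outp(0x70, 0x0C);		/* reading register C re-arms the RTC */
	inp(0x71);
	outp(0xA0, 0x20);		/* EOI to the slave PIC ... */
	outp(0x20, 0x20);		/* ... and to the master PIC */
}

void start_rtc_1024hz(void)
{
	unsigned char v;

	old_int70 = _dos_getvect(0x70);
	_dos_setvect(0x70, rtc_handler);

	outp(0x70, 0x0A);		/* register A: rate select 6 = 1024 Hz */
	v = inp(0x71);
	outp(0x70, 0x0A);
	outp(0x71, (v & 0xF0) | 0x06);

	outp(0x70, 0x0B);		/* register B: set the PIE bit (40h) */
	v = inp(0x71);
	outp(0x70, 0x0B);
	outp(0x71, v | 0x40);

	outp(0xA1, inp(0xA1) & ~0x01);	/* unmask IRQ 8 on the slave PIC */
}

void stop_rtc_1024hz(void)
{
	unsigned char v;

	outp(0x70, 0x0B);		/* periodic interrupt off again */
	v = inp(0x71);
	outp(0x70, 0x0B);
	outp(0x71, v & ~0x40);
	_dos_setvect(0x70, old_int70);
}

Remember to call stop_rtc_1024hz() before you exit.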
--
UUCP: {ucbvax,harvard}!cs.cmu.edu!ralf -=- 412-268-3053 (school) -=- FAX: ask
ARPA: ralf@cs.cmu.edu  BIT: ralf%cs.cmu.edu@CMUCCVMA  FIDO: 1:129/3.1
Disclaimer?    |   I was gratified to be able to answer promptly, and I did.
What's that?   |   I said I didn't know.  --Mark Twain

bcw@rti.rti.org (Bruce Wright) (09/14/90)

In article <26ee2f32@ralf>, Ralf.Brown@B.GP.CS.CMU.EDU writes:
> In article <4064@rtifs1.UUCP>, bcw@rti.rti.org (Bruce Wright) wrote:
> }In article <6250@castle.ed.ac.uk>, elee24@castle.ed.ac.uk (H Bruce) writes: 
> }> What is the best way to get a program to wait for one millisecond?
> }> (I am using Microsoft C V5.1).
> }
> }The usual way to get a timed delay of one millisecond on the
> }PC is to time a loop that's long enough to take a significant
> }amount of time even on a 486, and then scale that appropriately
> }to execute the loop the proper number of times for your PC.  This
> 
> On ATs and up, you can program the real-time clock chip to generate
> interrupts 1024 times per second, which will get you close to 1ms delays.

Yes, of course.  I _thought_ that was what the entire remainder of
my original article was about;  I may not have been quite clear
enough on what I was talking about.

The main problem with this is the resolution, depending on how
the timer is programmed.  If you _really_ need a delay of exactly 1 ms,
as might be the case for process control applications, you may find
the facilities in the PC clock chip rather limited (the straightforward
implementation of a clock that increments/decrements a counter
every time it gets a 1 kHz interrupt is going to result in an error
of up to 1 ms, not counting interrupt latency caused by any other
interrupts that are enabled).  Depending on your application it
_may_ be possible to program the clock with a count for an exact
delta time for an interrupt, but this assumes that you only ever
have one clock running at a time (can't keep track of time very
well except for that one delay).

The best method is going to depend on why the programmer wants a
delay in the first place ...

						Bruce C. Wright

darcy@druid.uucp (D'Arcy J.M. Cain) (09/14/90)

In article <6250@castle.ed.ac.uk> elee24@castle.ed.ac.uk (H Bruce) writes:
>What is the best way to get a program to wait for one millisecond?
>(I am using Microsoft C V5.1).
>
Unfortunately the best granularity is about 10/182 of a second, since the
system clock ticks about 18.2 times per second.  It's even worse because
you could be off by almost 10/182 of a second, because you don't know whether
you are sampling at the start or the end of a tick.  If you know the speed
of your CPU you can execute instructions that take the required amount of
time to complete.  There are two disadvantages to this.  You have to resort
to assembler, which may be a problem for some, and more importantly, events
asynchronous to your program (interrupts) may distort the timing.

If you can live with about 1/10 sec granularity here is a simple timing
construct.  First set up a pointer to a far time_t (long if your compiler
doesn't understand time_t) and initialize it to location 0x046c.  This is
a memory location which is updated at every clock tick.  Then read the value
at that location, add the wait value to get an end time, and keep checking the
location until it gets there.  Here is a simple function to do this.

#include <time.h>	/* for time_t; use long if your compiler lacks it */

void pause(int tenths)
{
	/* volatile so the compiler re-reads the tick count on every test */
	volatile time_t far *cur_time = (volatile time_t far *)0x046c;
	time_t end_time = *cur_time;

	/* note that we get the current time as soon as possible in the fewest */
	/* number of instructions in order to get the most accurate start time */
	/* now we will add the number of ticks to get the real end time */
	end_time += (tenths * 182)/100;	/* each tenth is 1.82 ticks */

	while (end_time > *cur_time)	/* wait till time is up */
		;
}

You can get fancier if you want but this is the basic idea.  Note that
this routine will fail at midnight when the clock rolls over but that's
a really small window.  If it is really important you can modify the code
to account for this.
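
For instance, a rollover-safe version can compare elapsed ticks instead
(1800B0 hex is the ticks-per-day value at which the BIOS resets the count
back to zero):

#include <time.h>

void pause(int tenths)
{
	volatile time_t far *cur_time = (volatile time_t far *)0x046c;
	time_t start = *cur_time;
	time_t ticks = ((time_t)tenths * 182) / 100;
	time_t elapsed;

	do {
		elapsed = *cur_time - start;
		if (elapsed < 0)		/* the count was reset at midnight */
			elapsed += 0x1800B0L;	/* ticks per day */
	} while (elapsed < ticks);
}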

-- 
D'Arcy J.M. Cain (darcy@druid)     |
D'Arcy Cain Consulting             |   MS-DOS:  The Andrew Dice Clay
West Hill, Ontario, Canada         |   of operating systems.
+ 416 281 6094                     |

phys169@canterbury.ac.nz (09/19/90)

In article <4064@rtifs1.UUCP>, bcw@rti.rti.org (Bruce Wright) writes:
> In article <6250@castle.ed.ac.uk>, elee24@castle.ed.ac.uk (H Bruce) writes: 
>> What is the best way to get a program to wait for one millisecond?
>> (I am using Microsoft C V5.1).
> 
>... you can reprogram the 8253 timer if your machine
> is a "true" compatible (which most are these days, but it depends
> on how paranoid you need to be for this application).  The problem
> with reprogramming the 8253 (besides hardware compatibility issues)
> is that you will then need to steal the timer interrupt from DOS,
> and if you want the DOS time to remain accurate you will need to
> provide DOS with a simulated interrupt every once in a while...

There is a *MUCH* better way than re-programming that chip, and it is easier
than using the AT's real-time clock chip, which simple PCs don't have. It is this:

Read the current count off the 8253 timer channel 0, the one normally used to
provide the 18.2Hz (approx) timer tick for the system. Don't reprogram it or
intercept the interrupt; simply keep looking at the current value (read port
040hex twice in a loop, LSB first) and wait for it to change by about 1193
counts - the counter decrements at 1193180Hz, so 1mS is 1193.18 counts. This
doesn't upset anything else, is easy to program, and is unlikely to be
affected by anything else people may do to the chip (e.g. BASIC's
reprogramming of it). The only thing to worry about is the interval being too
large because of interrupt service routines (including the system clock handler
itself) taking time from your program... you could do a bit of arithmetic to
calculate the counter value to wait for (but do a <= test, not =), allow
for the count wrapping when it decrements to zero, and read both successive
bytes with interrupts turned off.

The accuracy of the method is theoretically about 1 microsecond, but the CPU
overhead of starting and stopping may make this about 40uS on a 10MHz AT in a HLL.

example code: {Turbo Pascal, from memory (so E&OE)}

procedure DisableInterrupts; inline($FA);  {CLI}
procedure EnableInterrupts;  inline($FB);  {STI}

procedure WaitAMillisecond;
var LastCount,
    PresentCount : word;
    CountsToGo   : integer;
begin
DisableInterrupts;
Port[$43]:=0;                            {latch counter 0 for a consistent read}
PresentCount:=Port[$40]+256*Port[$40];   {LSB first, then MSB}
EnableInterrupts;
CountsToGo:=round(1.0*1.193180e3); {1.0 millisec at 1193.18 counts/mS; can be up to 27 mS}
repeat LastCount:=PresentCount;
       DisableInterrupts;
       Port[$43]:=0;
       PresentCount:=Port[$40]+256*Port[$40];
       EnableInterrupts;
       if LastCount>=PresentCount
          then CountsToGo:=CountsToGo-(LastCount-PresentCount)
          else CountsToGo:=CountsToGo-(65536-PresentCount+LastCount)
       until CountsToGo<=0;
end;

Hope this helps,
Mark Aitchison, Physics, University of Canterbury, New Zealand.