dan@mind.UUCP (Dan Kimberg) (04/24/88)
Hi, I have a question about getting accurate event timings on the Amiga. I'm writing an application (in C) for which it is absolutely essential that I be able to time intervals to a certain degree of accuracy. The Libraries and Devices manual seems to suggest that the only way to do this is to check the system clock before and after the event and compare the two readings.

My question is this: if that is the proper way to go about it, how well can I rely on it to give accurate readings? And if there's a better way, what is it? At rock bottom I need accuracy on the order of 200 msec or better, although it really only needs to be consistent (i.e. if it's systematically off by 150 msec due to overhead, I don't need to know). But to what extent can I expect the clock to give me analyzable data?

I apologize if this isn't technical enough for .tech, but my messages in the other group get mostly ignored, so I thought this was the best source of info.

-Dan (dan@mind.princeton.edu)