srw@cci632.cci.com (Steve Windsor) (04/23/91)
Well, I haven't seen this question before, so I hope someone can help me with it. I am evaluating the video performance of several machines, which includes timing how long it takes to paint an entire screen of characters at different resolutions. The problem is that with the clock ticking 18.2 times per second, my granularity is about 55 milliseconds. On the faster machines the paint times are shorter than this, so I do not get an accurate measurement: calling GetCurrentTime() before and after the paint, the difference comes out 0! Is there any way around this? Any ideas would be appreciated. Thanks, stephen windsor srw@op632.cci.com
ebergman@isis.cs.du.edu (Eric Bergman-Terrell) (04/24/91)
If what you're measuring is faster than your clock's resolution, do it several times and divide the elapsed time by the number of iterations (unless the loop and procedure-call overhead is prohibitive)... Terrell