[comp.windows.ms.programmer] Profiler - what the heck is this?

mmshah@athena.mit.edu (Milan M Shah) (02/13/91)

Hello.

I am trying to estimate the running time of a piece of code I wrote, and
surprisingly, I am at a loss for tools.

To start with, I read about the Windows profiler and said: how nice. Well,
I tried to run it in enhanced mode as per the directions and didn't hit any
operational problems. However, when I look at the output using ShowHits, it
gives me what seems to be semantic garbage. About 86% of the 'hits' occur in
KRNL386, in FatalExit and OutputDebugString (note that my app does not die a
terrible death or anything, with or without the profiler stuff). Exactly what
debug string Windows is trying to generate, and why it takes so long, is not
clear to me (I even have a second mono monitor and ox.sys, and nothing
appears on it). So, does anyone know how to use this profiler tool?
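
If it matters, is bracketing just the blitting code with the ProfStart() /
ProfStop() calls the profiler docs describe the right way to narrow the
sampling? Something like the following is what I had in mind (the prototypes
and the wrapper name are my own guesses -- I don't have the profiler header
in front of me):

    #include <windows.h>

    /* SDK profiler control calls -- prototypes assumed from the docs */
    void ProfStart(void);
    void ProfStop(void);

    void ProfileTheBlits(void)
    {
        ProfStart();    /* sample only from here ...          */
        /* ... the three 100x60 BitBlt calls go here ... */
        ProfStop();     /* ... to here, then look at ShowHits */
    }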

So I resorted to less exotic tools: a simple call to clock() before and
after the piece of code runs. For my code, the difference is 50 (ie, 50
processor clock ticks). Assuming my 386 is the special made-to-order RISC
version that achieves 1 instruction per clock, this tells me that I managed
to make three calls to BitBlt, each time blitting a 100x60 bitmap, all in 50
instructions (I would think that just setting up the parameters and executing
the rets would take at least that many, yes?).
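
For reference, the clock() measurement looks roughly like this (the wrapper
name is just for the sketch, and the conversion at the end is my reading of
the C runtime docs -- clock() counts library ticks, not CPU clocks):

    #include <time.h>

    /* returns elapsed seconds for the blitting code */
    double TimeTheBlits(void)
    {
        clock_t start, finish;

        start = clock();
        /* ... the three 100x60 BitBlt calls go here ... */
        finish = clock();

        /* clock() counts CLK_TCK ticks per second (CLOCKS_PER_SEC in
           ANSI C), not processor clocks, so a difference of 50 means
           50/CLK_TCK seconds, not 50 instructions */
        return (double)(finish - start) / CLK_TCK;
    }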

Instead of clock(), I tried GetCurrentTime(), an SDK function that returns
milliseconds. Now the difference is consistently 55 (ie, my BitBlts are
costing me 55 ms * 20 MHz = 1.1 million clock cycles). I hope this is not
true.
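
The timing there looks roughly like this (again the wrapper name is only a
placeholder, and the comment about granularity is just my guess about why
the number always comes out to exactly 55):

    #include <windows.h>

    void TimeTheBlits(void)
    {
        char  buf[64];
        DWORD before, after;

        before = GetCurrentTime();   /* ms since Windows started */
        /* ... the three 100x60 BitBlt calls go here ... */
        after = GetCurrentTime();

        /* 55 ms at 20 MHz is the 1.1 million clocks above; if the
           underlying timer only advances every ~55 ms, a single reading
           this small could be mostly timer granularity (my guess) */
        wsprintf(buf, "elapsed: %lu ms", after - before);
        OutputDebugString(buf);
    }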

So, can anyone suggest how best to measure the running cost of a piece of code?

Thank you in advance!

Milan