Hi,
I'd like to measure the time that elapses within a loop. The loop is executed only once between each rendering by the engine (in other words, it would run at each call of repeatedly_execute_always).
So far, let me know if anything I've said doesn't make sense.
It doesn't matter what the loop does. What's important is that I want to measure how long it takes. And since it's executed at each frame, you can assume that its execution time is relatively small compared to the overall time spent on the other scripts.
You start to get it: the time I want to measure is very short.
So my question is: what tool would you recommend to measure that? Is a timer accurate enough*? What else would you use?
* If the measurement error is higher than, let's say, 3 times the time I want to measure, you'll understand it's not acceptable. However, I'd tolerate measurements with 50% error (0.5 times the real value) to 100% error (1 time the real value).
EDIT: I'd tolerate a frame rate as low as 15 fps, which means I'd spend roughly 50ms on the total script execution at each frame. If I assume the loop I want to measure can take 10% of that time, the times I want to measure can be as small as 5ms. Let's say 1ms for better results.
I had an idea; tell me if it sounds crazy:
1. Push SetGameSpeed to the maximum.
2. Put a "Wait" in the global "repeatedly_execute" to make sure the engine won't try to redraw everything until it explodes. I can choose the value I give to that "Wait" to regulate the actual game speed.
3. Put a counter in "repeatedly_execute_always". If everything's fine, this counter should tick at the speed defined by the "SetGameSpeed" of step 1.
4. Put the loop I want to measure in "repeatedly_execute".
To measure how long it took to execute the loop, all I have to do is divide the number of ticks elapsed between two calls of "repeatedly_execute" by the speed defined in "SetGameSpeed" (rough sketch below the notes).
Notes:
a) It won't give me a time in seconds, more of a relative time compared to the overall game speed, but it's OK.
b) This only works if the FPS doesn't drop so low that the actual game speed becomes completely different from the speed defined in "SetGameSpeed". I need to make sure my treatments are fast enough.
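In script terms, this is roughly what I have in mind (just a sketch of steps 1-4; the Wait value and variable names are arbitrary, and whether the tick count really captures the loop's cost is exactly my question):

// Rough sketch of the idea above, in the global script
int ticks;              // step 3: incremented once per game loop
int ticksAtLastRepEx;   // value of 'ticks' at the previous repeatedly_execute
float lastMeasurement;  // relative time between the last two repeatedly_execute calls

function game_start()
{
  SetGameSpeed(1000);   // step 1: push the engine speed to the maximum
}

function repeatedly_execute_always()
{
  ticks++;              // should tick at roughly GetGameSpeed() per second
}

function repeatedly_execute()
{
  // ticks elapsed since the previous repeatedly_execute, relative to the game speed
  int elapsed = ticks - ticksAtLastRepEx;
  ticksAtLastRepEx = ticks;
  lastMeasurement = IntToFloat(elapsed) / IntToFloat(GetGameSpeed());

  // step 4: the loop I want to measure would go here

  Wait(5);              // step 2: block for a known number of game loops
}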
What do you think?
Scorpiorus scripted an FPS counter; it might help you out, as he uses RawTime.
http://www.adventuregamestudio.co.uk/yabb/index.php?topic=5388.0
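For reference, a RawTime-based counter is basically this shape (a rough sketch, not Scorpiorus's actual code; note that RawTime only has one-second resolution):

int framesThisSecond;
int lastSecond;
int fps;   // frames counted during the previous full second

function repeatedly_execute_always()
{
  framesThisSecond++;
  DateTime *now = DateTime.Now;
  if (now.RawTime != lastSecond)   // a new second has started
  {
    fps = framesThisSecond;
    framesThisSecond = 0;
    lastSecond = now.RawTime;
  }
}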
Quote from: Jim Reed on Thu 10/06/2010 13:13:11
Scorpiorus scripted an FPS counter
Unless I think of a clever way to use it, that wouldn't help, because it can only calculate (literally) the number of frames rendered per second, and not the time elapsed individually on each frame (the result of the calculation "1/FPS" would be an average).
If I can't know the time spent on one frame, I can't know the time spent on a specific operation within the script during that frame either.
A (brutal) workaround would be to perform the treatment I want to measure once every 2 frames, and all the rest of the processing during the remaining frames. It would allow me to roughly measure how much time is spent on my treatment - but unless I'm sure there's no better solution, I'd rather not do that.
hmm... what if you set the game speed to 60 and add a counter? Then you could check if it really took 1 second for the counter to reach 60.
For everything else, I suppose rawtime needs to support microseconds :(
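Roughly what I mean by the counter check, as a sketch (RawTime only counts whole seconds, which is exactly the limitation):

// Check whether one "game second" of loops really takes one real second
int loops;
int secondAtStart;

function repeatedly_execute()
{
  if (loops == 0)
  {
    DateTime *start = DateTime.Now;
    secondAtStart = start.RawTime;
  }
  loops++;
  if (loops == GetGameSpeed())     // e.g. 60 loops after SetGameSpeed(60)
  {
    DateTime *now = DateTime.Now;
    Display("%d loops took roughly %d real second(s)", loops, now.RawTime - secondAtStart);
    loops = 0;
  }
}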
I might have a solution; it's crazy but it might just work. Play a midi and fetch the precise position at the start and end of the routine. That might give you something to calculate. ;D
Alternatively you could also use an MP3 file, though I have no idea how accurate it is.
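Something along these lines, assuming the old-style audio functions are available: GetMP3PosMillis reports the position of the currently playing MP3/OGG music in milliseconds, while for a MIDI you'd use GetMIDIPosition, which only gives the beat number. MeasureMyRoutine is just a placeholder name:

// An MP3/OGG music track must already be playing, e.g. PlayMusic(1) in game_start
function MeasureMyRoutine()
{
  int before = GetMP3PosMillis();

  // ... the routine to measure goes here ...

  int after = GetMP3PosMillis();
  Display("The routine took roughly %d ms", after - before);
}

How often the player actually updates that position is the part I'm not sure about.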
Quote from: abstauber on Thu 10/06/2010 14:54:28
hmm... what if you set the game speed to 60 and add a counter? Then you could check if it really took 1 second for the counter to reach 60.
I'm not sure whether that's a new idea, or whether it's redundant with the one I described in the post starting with "I had an idea".
Could you elaborate? Thanks!
EDIT: Our ideas are similar in their underlying mechanism, but the final approach is slightly different:
Both ideas start by overclocking the game engine, however:
1/ You measure the delay that follows from the game's inability to meet the requested speed.
2/ I prevent the game from doing anything for "long" periods of time, and then measure how much idle time it had.
Quote from: Wyz on Thu 10/06/2010 15:15:12
Play a midi and fetch the precise position at the start and end of the routine.
That's exactly the kind of powerful workaround I was looking for.
Does anyone have enough experience with the MIDI functions to tell:
1/ whether they're fast to call, or whether that external call is awfully slow?
2/ whether they're accurate, or whether they can be significantly behind or ahead (since they surely rely on a separate thread)?
Quote from: Wyz on Thu 10/06/2010 15:15:12
Play a midi and fetch the precise position at the start and end of the routine.
Quote from: abstauber on Thu 10/06/2010 14:54:28
what if you set the game speed to 60 and add a counter? Then you could check if it really took 1 second for the counter to reach 60.
Bumping. Any other ideas? If not, I'll consider the case solved.
In any case, thank you guys; both ideas are very good.
Nope, Wyz's idea is wicked :)
Though I'd suggest that you use an ogg with the directx hardware mixer. This should be most accurate.
Quote from: abstauber on Fri 11/06/2010 09:19:44
Though I'd suggest that you use an ogg with the directx hardware mixer. This should be most accurate.
Possible, but wouldn't it use a lot more resources than the MIDI?
That's where I really need the opinion of someone who often uses the music and sound functions (or do you, abstauber?)
Sometimes ;D
http://ags-ssh.blogspot.com/2009/11/while-my-guitar-gently-bleeps.html
I use a midi for timing the notes and check the ogg position to see if they are still in sync.
Quote from: abstauber on Fri 11/06/2010 10:30:12
I use a midi for timing the notes and check the ogg position to see if they are still in sync.
Thanks for your sound advice.
And thanks to Wyz and Jim Reed!
Thanks, good to know this worked out.
I've made a measurement tool real quick that gives you the time between two mouse clicks (mouse up). You can use it to calculate error margins. It should be pretty accurate, but there are issues with dual core processors (affinity).
http://wyz.agser.me/dl/other/measure%20precise%20time.zip
Quote from: Wyz on Fri 11/06/2010 17:09:43
there are issues with dual core processors (affinity).
Could you elaborate?
The application uses the processor's internal counter, which ticks up continuously as the processor runs. When you have more than one processor (like dual core PCs), the counters of the processors might not be synchronised. Since Windows divides the load over the processors, the program might be run by either of the two, and the count will leap forward and backward by whatever the difference between the cores is. There are programs that make an application run on a single core though, and that should fix it (search for processor affinity).
Another issue is when a processor doesn't have a very stable clock speed. There is not much you can do about that; it will influence all applications equally though.