Millisecond wait suggestion

Started by Reko, Wed 13/10/2004 23:21:34


Reko

Hey.

How about a function that acts similarly to Wait(), except it takes milliseconds as a value? I find it more comfortable to work in milliseconds, since it's more precise and easier to work with than the 40ths-of-a-second system used by Wait().

e.g.:
waitms(2000); would wait 2 seconds, 500 would be half a second, 1000 a second, etc.

Of course, depending on what sort of timing system AGS uses, this might not be possible. :-\

Scummbuddy

I believe you could create your own function to do so. Or you could use Wait(1) and set your game loop speed accordingly so that it works on a millisecond timeframe.
- Oh great, I'm stuck in colonial times, tentacles are taking over the world, and now the toilet's backing up.
- No, I mean it's really STUCK. Like adventure-game stuck.
-Hoagie from DOTT

strazer

function WaitMS(int milliseconds) {
  // GetGameSpeed() returns the game speed in loops per second (default 40)
  int gameloops = (GetGameSpeed() * milliseconds) / 1000;
  if (gameloops == 0) gameloops = 1; // Wait(0) is invalid, so wait at least one loop
  Wait(gameloops);
}

(Edited to avoid Wait(0))

Goot

The purpose of the Wait function working in loops is so you can easily make an option to change the game speed, by changing the number of loops per second.

In strazer's function, will the /1000 round, or leave a decimal? If you end up with something like Wait(56.128); it's going to give you an error.

If you do end up needing to round it you can use this (it will always round down):

int round;

round = 0;
while (num >= 1) { // num is the number you are rounding ("int" is a reserved word, so it can't be a variable name)
  num -= 1;
  round += 1;
}
num = round;

Unless I typed something wrong, that will work. I've tested it.

strazer

AGS doesn't support decimals, so integers will automatically round up/down.

Radiant

Specifically, down.

I seem to recall that negative numbers are also rounded down in AGS (as in, rounded to further negative numbers), as opposed to C, where they are rounded towards zero - but that's not particularly important, I suppose.

TerranRich

strazer: The question being asked was whether he can have AGS wait less than 1 game cycle, to which the answer is no. Just alter the game's frame/cycle rate to 1000; that way, 1 game cycle = 1 millisecond.
Status: Trying to come up with some ideas...

strazer

Well, in his question he said it would be more comfortable and easier to work with, so I guess my suggestion is valid.

Radiant

...except that the game's frame rate is limited by your computer's speed. For instance, my computer won't go above 60 or so (if I set it to 80, it behaves as 60, since my computer isn't any faster than that). I doubt even top-of-the-line PCs would reliably hit 1000.
If you're willing to settle for centiseconds, you could probably get it to work.

Which leaves the question of what exactly you wanted it for, so you could get some more specific help.

fovmester

Quote: The question being asked was whether he can have AGS wait less than 1 game cycle, to which the answer is no. Just alter the game's frame/cycle rate to 1000; that way, 1 game cycle = 1 millisecond.

Why would you want to wait 1 millisecond? Nobody will ever notice it. If your game runs at 40 cycles/sec (40 FPS), then Wait(1) waits 1/40 of a second, which is already less time than anyone could notice. Any smaller delay wouldn't matter.

Rui 'Trovatore' Pires

Sorry for bringing this thread back, but I thought of a possible use for this "millisecond Wait".

Fading is very neat, and Wait allows us to do it inside a "while" loop, or even an "if" inside repeatedly_execute. But sometimes Wait(1) is still not fast enough, and speeding it up by, say, changing the object's transparency by 5 instead of 2 each time can make for some jerky fading.
Reach for the moon. Even if you miss, you'll land among the stars.

Kneel. Now.

Never throw chicken at a Leprechaun.

Janik

I have an idea that might work as long as timing isn't critical (it wouldn't be very precise). How about running a calibration once, on game_start? Something like this (I don't remember the syntax for do-while loops in C...):

//Start counting at the beginning of a second
start = GetTime(3);
do
  now = GetTime(3);
loop until now == start + 1;

//Count how many loops needed
start = GetTime(3);
count = 0;
do
  count++;
  now = GetTime(3);
loop until now == start + 1;

And then there are count cycles per second.

Then, do

for (i = 0; i < Milliseconds*count/1000; i++)
{
  now = GetTime(3);
}

And that should be the right delay (haven't tested). Counting for 1 game cycle instead might be nicer, since it wouldn't involve a 1-second delay each time the game starts.
Play pen and paper D&D? Then try DM Genie - software for Dungeons and Dragons!

Hellomoto

The syntax for a C do-while loop is

do {
...
} while(...);

If you read the code above, insert a brace after each 'do' and change 'loop until' to '} while', putting the comparison that follows inside parentheses.
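Applying those edits to Janik's calibration pseudocode gives something like this C sketch (an illustration, not tested against AGS: time(NULL), which returns whole seconds, stands in for AGS's GetTime(3), and the exit test is written as < rather than == so the loop can't spin past the boundary):

```c
#include <time.h>

/* Janik's calibration rewritten with proper C do-while syntax.
   time(NULL) stands in for AGS's GetTime(3); both give whole seconds. */
long calibrate_loops_per_second(void) {
    time_t start, now;
    long count;

    /* Start counting at the beginning of a second */
    start = time(NULL);
    do {
        now = time(NULL);
    } while (now < start + 1);

    /* Count how many loops fit in one full second */
    start = time(NULL);
    count = 0;
    do {
        count++;
        now = time(NULL);
    } while (now < start + 1);

    return count; /* loop iterations per second on this machine */
}
```

One caveat for a real AGS version: GetTime(3) returns the seconds component of the clock (0-59), so the start + 1 comparison would need to handle the wrap at the end of a minute.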

Could you not access your system's clock using a C function? I do not know if AGS would support such a function. The milliseconds on your system are measured from a fixed date (the Unix epoch, 1 January 1970). If you could read this number at the point from which you wished to begin the count, and keep reading it until it passed the figure you wished to reach, you could measure a millisecond fairly accurately. However, this would be very system-intensive, as the test would need to be carried out at a very rapid rate.

The issue arises again that this would depend entirely on the frame rate of your game. I cannot post code for this, as I do not know if the required functions are available.

Janik

As far as I can tell, AGS can only get the system time down to the nearest second, using GetTime.

My method shouldn't be framerate-dependent, since, I believe (but I'm not certain), script execution is blocking. So nothing else in the game would slow the calculation. If a background program is running and using up the CPU, then it would indeed throw off the calculation.

I just thought of something, however - in the case of a gradual fade this wouldn't work, since the screen still only updates at the rate set by SetGameSpeed (e.g. 40 times per second). So, no, there's no point in delaying less than 1 game cycle.

fovmester

Quote from: redruM on Tue 26/10/2004 21:51:52
Sorry for bringing this thread back, but I thought of a possible use for this "millisecond Wait".

Fading is very neat, and Wait allows us to do it inside a "while" loop, or even an "if" inside repeatedly_execute. But sometimes Wait(1) is still not fast enough, and speeding it up by, say, changing the object's transparency by 5 instead of 2 each time can make for some jerky fading.

Actually, Wait(1) waits exactly one game loop, which means one screen update. If you changed the object's transparency faster than that (say, five times per screen update), it would still look the same as changing it by 5 once and then calling Wait(1). So there's no way to make anything smoother than the update frequency allows.

This is why modern games want as high an FPS (frames per second) as possible. A normal update frequency in AGS is 40 FPS, but in 3D shooters you'd often want at least 75, and often higher, for the game to feel smooth. So to make your AGS game run smoother, increase the update frequency (the computer will have to be up to the task, though).

