Running Nelly C on Linux - Problem with GetGameSpeed?

Started by Ali, Sat 31/03/2007 17:19:59


Ali

Some people have reported a crash when running my game on the Linux AGS engine. The error report is here:

Code: ags
scott@trousersnake:~/Desktop/nelly$ ./ags Nelly\ Cootalot.exe 
Adventure Creator v2.72 Interpreter
Copyright (c) 1999-2001 Chris Jones
ACI version 2.72.920
CD-ROM Audio support enabled.
Pentium Pro CPU detected.
Checking sound inits.
An internal error has occured. Please note down the following information.
If the problem persists, contact Chris Jones.
(ACI version 2.72.920)

Error: run_rep_exec: error -6 (Error: Floating point divide by zero
in Global script (line 468)
) running function 'repeatedly_execute_always'
scott@trousersnake:~/Desktop/nelly$


The offending line of script is this:

Code: ags
float timeStep = 10.0/IntToFloat(GetGameSpeed());


This works fine in the Windows version. I wondered whether there's a problem with GetGameSpeed in the Linux engine that could result in a divide-by-zero error.

I'd appreciate any help people could offer on this; I don't know anything about the Linux AGS engine.

DoorKnobHandle

I'm not sure, but maybe there's a brief gap of a fraction of a second, because of the way Linux or the emulator or whatnot works, during which GetGameSpeed ( ) returns zero.

For the record, when dividing like this BY a variable that can be zero at some point in time, always check before dividing, like this:

Code: ags

float buffer = IntToFloat ( GetGameSpeed ( ) );

if ( buffer == 0.0 )
   buffer = 0.00000001;

float timeStep = 10.0 / buffer;

Ali

Thanks dkh,

I can see that this code would stop the game crashing, but if GetGameSpeed is returning inaccurate values I worry that the smooth scrolling script wouldn't work properly. Might GetGameSpeed eventually 'catch up' and start returning accurate values?

DoorKnobHandle

I'm really not absolutely sure, but I think that GetGameSpeed ( ) might return zero for just about one or two frames - a really short time for us. In that case, the scrolling might lock up for a millisecond or something, but it would eventually catch up.

As I said, that's only what I'm thinking; CJ would need to clear this up, or you need to try it out yourself (or let somebody else with Linux try it out for you).

Hope this helps.

Kweepa

If you don't change GameSpeed you might as well use 40 or 60 or whatever GetGameSpeed() is supposed to return. I wouldn't recommend dkh's suggestion of 0.00000001 as that would give you an enormous timeStep which might do something crazy to the smooth scrolling.

Unfortunately I don't have a Linux machine here to test out the problem and see if GetGameSpeed() eventually recovers. I use that code fragment quite a lot in my games and modules so it would be good to know. I suspect, like dkh, that it's a problem only on the first frame of the game.
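For example, something like this would sidestep the call entirely (just a sketch assuming the speed stays at the engine default of 40; substitute whatever value you actually use):

Code: ags
// Sketch: if SetGameSpeed is never called, the speed is a known constant
// (40 by default), so there's no need to ask the engine for it at all.
float timeStep = 10.0 / 40.0;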
Still waiting for Purity of the Surf II

Ali

Hmm... I don't alter the game speed, so I guess I'll follow your suggestion, Steve, and just make it 60.

Thanks for the help, hopefully this will fix my problem. I won't change the thread title to 'solved' though, as we're not absolutely certain what the root of the problem is.

mariano

Why not store the last value of GetGameSpeed when it's not 0, and use that when it is 0?

Code: ags
// LastGameSpeed is a global int, initialised to the default speed (40)
if (GetGameSpeed() != 0) LastGameSpeed = GetGameSpeed();
float timeStep = 10.0/IntToFloat(LastGameSpeed);

Rui 'Trovatore' Pires

If you don't alter the game speed then make it 40; that's what it defaults to.
Reach for the moon. Even if you miss, you'll land among the stars.

Kneel. Now.

Never throw chicken at a Leprechaun.

GarageGothic

To my knowledge, GetGameSpeed() doesn't represent actual framerate but only the target framerate, which you've selected using SetGameSpeed(). Unless you change it yourself, it will default to 40.

Furthermore, the manual entry for SetGameSpeed(int new_speed) specifies:

Quote: The NEW_SPEED must lie between 10 and 1000. If it does not, it will be rounded to 10 or 1000.

In other words, there's no possible way short of an engine error that GetGameSpeed() could return anything below 10. I'm guessing there's some kind of glitch in the Linux engine if that line results in a divide-by-zero.
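If you wanted to guard against an engine bug anyway, something like this would keep the division safe (just a sketch on my part, clamping to the manual's documented minimum of 10 rather than trusting the return value):

Code: ags
// Sketch: clamp the reported speed to the documented minimum of 10
// before dividing, so a bad return value can't cause a divide by zero.
int speed = GetGameSpeed();
if (speed < 10) speed = 10;
float timeStep = 10.0 / IntToFloat(speed);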

Edit: Yeah, I just checked a section of my game where the true framerate dips to 7, but GetGameSpeed() remained a constant 40. The only non-engine cause I can think of is whether the System.Vsync property is turned on (either through script or game setup). That would cap the game speed to the screen's refresh rate, and if the Linux display drivers for some reason return an invalid Hz value to AGS, I guess that could cause problems.

Ali

Vertical sync is turned on by default in Nelly Cootalot, so that may tie in with what you're saying, GG. Nonetheless, I'm about to upload a version which doesn't call GetGameSpeed. Hopefully it'll work!

EvilTypeGuy

The Linux version, Windows version, and Mac version all use the exact same code to return the value that GetGameSpeed returns. The basic logic is to return frames per second - game speed modifier.

I doubt this bug is Linux specific. Rather, this is probably something to do with the frames_per_second or game speed modifier.
