
Author Topic: Running Nelly C on Linux - Problem with GetGameSpeed?  (Read 2417 times)

Ali

  • What will become of the baron?
    • Ali worked on one or more games that won an AGS Award!
    • Ali worked on one or more games that were nominated for an AGS Award!
Some people have reported a crash when running my game on the Linux AGS engine. The error report is here:

Code: [Select]
scott@trousersnake:~/Desktop/nelly$ ./ags Nelly\ Cootalot.exe
Adventure Creator v2.72 Interpreter
Copyright (c) 1999-2001 Chris Jones
ACI version 2.72.920
CD-ROM Audio support enabled.
Pentium Pro CPU detected.
Checking sound inits.
An internal error has occured. Please note down the following information.
If the problem persists, contact Chris Jones.
(ACI version 2.72.920)

Error: run_rep_exec: error -6 (Error: Floating point divide by zero
in Global script (line 468)
) running function 'repeatedly_execute_always'
scott@trousersnake:~/Desktop/nelly$

The offending line of script is this:

Code: [Select]
float timeStep = 10.0/IntToFloat(GetGameSpeed());
This works fine in the Windows version, so I wondered if there's a problem with GetGameSpeed in the Linux engine that could result in a divide-by-zero.

I'd appreciate any help people could offer on this; I don't know anything about the Linux AGS engine.

DoorKnobHandle

  • Mittens Serf
    • DoorKnobHandle worked on one or more games that won an AGS Award!
    • DoorKnobHandle worked on one or more games that were nominated for an AGS Award!
Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #1 on: 31 Mar 2007, 17:28 »
I'm not sure, but maybe there's a brief gap of a second or so at startup, because of the way Linux (or the emulator, or whatnot) works, during which GetGameSpeed() returns zero.

For the record, when dividing like this by a variable that can be zero at some point in time, always check before dividing, like this:

Code: [Select]
float buffer = IntToFloat ( GetGameSpeed ( ) );

if ( buffer == 0.0 )
   buffer = 0.00000001;

float timeStep = 10.0 / buffer;

Ali

Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #2 on: 31 Mar 2007, 17:41 »
Thanks dkh,

I can see that this code would stop the game crashing, but if GetGameSpeed is returning inaccurate values I worry that the smooth scrolling script wouldn't work properly. Might GetGameSpeed eventually 'catch up' and start returning accurate values?

DoorKnobHandle

Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #3 on: 31 Mar 2007, 18:04 »
I'm really not absolutely sure, but I think that GetGameSpeed ( ) might return zero for just about one or two frames - a really short time for us. In that case, the scrolling might lock up for a millisecond or something, but it would eventually catch up.

As I said, that's only what I'm thinking, CJ would need to clear this up, or you need to try it out yourself (or let somebody else try it out for you with Linux).

Hope this helps.

Kweepa

  • Mutated Guano Deviser
    • Best Innovation Award Winner 2009, for his modules and plugins
    • Kweepa worked on one or more games that won an AGS Award!
    • Kweepa worked on one or more games that were nominated for an AGS Award!
Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #4 on: 31 Mar 2007, 18:12 »
If you don't change the game speed, you might as well hard-code 40 or 60 or whatever GetGameSpeed() is supposed to return. I wouldn't recommend dkh's suggestion of 0.00000001, as that would give you an enormous timeStep, which might do something crazy to the smooth scrolling.

Unfortunately I don't have a Linux machine here to test out the problem and see if GetGameSpeed() eventually recovers. I use that code fragment quite a lot in my games and modules so it would be good to know. I suspect, like dkh, that it's a problem only on the first frame of the game.
Still waiting for Purity of the Surf II

Ali

Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #5 on: 01 Apr 2007, 12:43 »
Hmm... I don't alter the game speed, so I guess I'll follow your suggestion, Steve, and just make it 60.

Thanks for the help, hopefully this will fix my problem. I won't change the thread title to 'solved' though, as we're not absolutely certain what the root of the problem is.

Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #6 on: 01 Apr 2007, 19:24 »
Why not store the last value of GetGameSpeed() when it's not 0, and use that when it is 0?

Code: [Select]
if (GetGameSpeed() != 0) LastGameSpeed = GetGameSpeed();
float timeStep = 10.0/IntToFloat(LastGameSpeed);

Rui 'Trovatore' Pires

  • Lunge da lei per me non v'ha diletto!
    • I can help with AGS tutoring
    • I can help with play testing
    • I can help with proof reading
    • I can help with scripting
    • I can help with story design
    • I can help with translating
    • I can help with voice acting
Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #7 on: 01 Apr 2007, 23:49 »
If you don't alter the game speed, then make it 40; that's what it defaults to.
Reach for the moon. Even if you miss, you'll land among the stars.

Kneel. Now.

Never throw chicken at a Leprechaun.

Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #8 on: 02 Apr 2007, 01:37 »
To my knowledge, GetGameSpeed() doesn't represent actual framerate but only the target framerate, which you've selected using SetGameSpeed(). Unless you change it yourself, it will default to 40.

Furthermore, the manual entry for SetGameSpeed(int_newspeed) specifies:

Quote
The NEW_SPEED must lie between 10 and 1000. If it does not, it will be rounded to 10 or 1000.

In other words, short of an engine error, there's no possible way GetGameSpeed() could return anything below 10. I'm guessing there's some kind of glitch in the Linux engine if that line results in a divide-by-zero.

Edit: Yeah, I just checked a section of my game where the true framerate dips to 7, and GetGameSpeed() remained a constant 40. The only non-engine cause I can think of is the System.Vsync property being turned on (either through script or game setup). That would cap the game speed to the screen's refresh rate, and if the Linux display drivers for some reason return an invalid Hz value to AGS, I guess that could cause problems.
« Last Edit: 02 Apr 2007, 10:37 by GarageGothic »

Ali

Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #9 on: 02 Apr 2007, 15:16 »
Vertical sync is turned on by default in Nelly Cootalot, so that may tie in with what you're saying, GG. Nonetheless, I'm about to upload a version which doesn't call GetGameSpeed. Hopefully it'll work!

EvilTypeGuy

  • Freaky Penguin Dude
Re: Running Nelly C on Linux - Problem with GetGameSpeed?
« Reply #10 on: 02 Apr 2007, 23:46 »
The Linux version, Windows version, and Mac version all use exactly the same code for the value that GetGameSpeed returns. The basic logic is to return frames per second minus the game speed modifier.

I doubt this bug is Linux-specific. Rather, it probably has something to do with the frames_per_second value or the game speed modifier.