#define vs int

Started by Retro Wolf, Tue 05/04/2016 18:27:59


Retro Wolf

Code: ags

#define myNumber 1
int myNumber 1;


I'm interested to know what the differences are. When I use the first one, I'm treating it as a constant in my code, to avoid magic numbers. Is that the correct use? Are there any advantages (such as memory usage)?

Khris

#1
#define basically makes the compiler do a search & replace before compilation, afaik, which means the memory used for a #define is essentially zero. A variable declared with int, on the other hand, occupies actual storage and can be changed at runtime.

If you tried to actually use that snippet, the second line would turn into
Code: ags
int 1 1;   // or: int 1 = 1;
and cause a compilation error.


Retro Wolf

Cheers for the reply Khris!
