Overview

Following a recent request (see issue 2705 (https://github.com/adventuregamestudio/ags/issues/2705)), I started working on an experimental feature which allows attaching custom pixel shaders to game objects in AGS 4.0.

THIS POST WAS UPDATED TO COVER THE LATEST STATE

The two PRs with shader support were merged into the AGS 4 branch:
https://github.com/adventuregamestudio/ags/pull/2716
https://github.com/adventuregamestudio/ags/pull/2733
This feature has been part of AGS 4 since the Alpha 22 update. Please see the AGS 4 release thread for download links:
https://www.adventuregamestudio.co.uk/forums/ags-engine-editor-releases/ags-4-0-early-alpha-for-public-test/
Demo game project:
https://www.dropbox.com/scl/fi/q8lwoi6xwg4m7apezs6i8/ags4-shaders.zip?rlkey=12d4gxmfln1dh4h1zh0e2vpfs&st=9wdbt6e4&dl=0
Compiled game:
https://www.dropbox.com/scl/fi/yj98k9o43tiuufbo180cq/ags4-shaders-game.zip?rlkey=yspxgpfzduspun7o6a6s8kq4l&st=xuewfvb4&dl=0
Example preview:
Spoiler
(https://i.imgur.com/wJuzeLR.gif)
Explanation and instructions

The idea overall is this:

1. User writes custom shader scripts

Shaders are written in either GLSL (the OpenGL shading language) or HLSL (Microsoft's shading language for Direct3D). These scripts are distributed along with the game, either as external files, or packaged using the "Package custom folders" option in General Settings.
OpenGL graphics driver uses GLSL scripts and compiles them into shaders at runtime.
Direct3D graphics driver uses HLSL scripts and compiles them into shaders at runtime.
Optionally, Direct3D can also load precompiled shader objects (FXO). These may be created using Microsoft's "fxc" utility ("Effect-Compiler") from the older DirectX SDK (https://www.microsoft.com/en-us/download/details.aspx?=&id=6812) (deprecated and archived), or possibly its modern equivalent "dxc" ("DirectX shader compiler") (https://github.com/microsoft/DirectXShaderCompiler/releases). Frankly, I haven't tested the latter yet, so I'm not fully sure whether it supports the old HLSL dialect for Direct3D 9, which is what AGS uses.
2. In script there are two new structs declared: ShaderProgram and ShaderInstance

ShaderProgram represents a compiled shader, while ShaderInstance represents a shader setup: a shader plus a set of custom shader values.
For those who have an idea of how modern game engines work: you may think of ShaderInstance as a kind of very limited "Material" type.
ShaderProgram is used to create ShaderInstances, and ShaderInstances are assigned to game objects.
The following may have a shader assigned:
* Screen.Shader
* Viewport.Shader (e.g. Screen.Viewport.Shader)
* Camera.Shader (e.g. Game.Camera.Shader)
* Object.Shader, Character.Shader, GUI.Shader, GUIControl.Shader, Overlay.Shader
* Room.BackgroundShader
* Mouse.CursorShader
The new structs are declared like:
builtin managed struct ShaderProgram {
/// Creates a new ShaderProgram by either loading a precompiled shader, or reading source code and compiling one.
import static ShaderProgram* CreateFromFile(const string filename); // $AUTOCOMPLETESTATICONLY$
/// Creates a new shader instance of this shader program.
import ShaderInstance* CreateInstance();
/// Gets the default shader instance of this shader program.
import readonly attribute ShaderInstance* Default;
};
builtin managed struct ShaderInstance {
/// Sets a shader's constant value as 1 float
import void SetConstantF(const string name, float value);
/// Sets a shader's constant value as 2 floats
import void SetConstantF2(const string name, float x, float y);
/// Sets a shader's constant value as 3 floats
import void SetConstantF3(const string name, float x, float y, float z);
/// Sets a shader's constant value as 4 floats
import void SetConstantF4(const string name, float x, float y, float z, float w);
/// Sets a shader's secondary input texture, using a sprite number. Only indexes 1-3 are supported.
import void SetTexture(int index, int sprite);
/// Gets this instance's parent Shader
import readonly attribute ShaderProgram* Shader;
};
You create a ShaderProgram using the CreateFromFile static method. The filename you pass may have a glsl, hlsl or fxo extension, but the engine will choose the actual file depending on the current graphics driver. For example, you call CreateFromFile("shaders/myshader.glsl"). If you run with OpenGL, the engine will look for "myshader.glsl". If you run with Direct3D, the engine will look for "myshader.fxo" (a precompiled DirectX shader), and if that's not present, then for "myshader.hlsl" (DirectX shader source). This works vice versa too. This approach lets you pass the filename in any format, without writing extra if/else conditions in script that check the current gfx driver (although you may if you need to).
A ShaderProgram object is always created, even if shader compilation fails for any reason. This is essential, because the graphics driver may not support this shader, or may not support shaders at all (such as the Software graphics driver). This lets you write scripts without worrying about things failing if the player switches to another driver. Of course, the real visual effect will only appear if the actual shader was initialized successfully; otherwise this shader program will simply do nothing.
After the ShaderProgram is created, you have two options:
* use the ShaderProgram.Default property, which returns an always-present default ShaderInstance. This instance is there to simplify things for you.
* create more ShaderInstances using the ShaderProgram.CreateInstance() method.
A simple example would look like this:
ShaderProgram* myshader;
function game_start()
{
myshader = ShaderProgram.CreateFromFile("$DATA$/shaders/myshader.glsl");
Screen.Shader = myshader.Default;
}
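The same default instance can also be shared between several of the targets listed earlier. A hedged sketch (the shader filename is hypothetical):

```ags
// Sketch: one shader's default instance shared across several targets.
// "shaders/grayscale.glsl" is a hypothetical file.
ShaderProgram* grayShader;

function game_start()
{
    grayShader = ShaderProgram.CreateFromFile("$DATA$/shaders/grayscale.glsl");
    Screen.Shader = grayShader.Default;      // whole screen
    player.Shader = grayShader.Default;      // the player character
    Mouse.CursorShader = grayShader.Default; // the mouse cursor
}
```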
Why create more instances? Shaders may have "constants" in them, which may be thought of as shader settings. These "constants" are not really constants in the general sense; they are called "constants" because they don't change while the shader is used in drawing. But you may change their values between draws.
There are a lot of uses for constants. Just to give a couple of examples:
* a shader that tints a sprite by adding a certain color may have a constant "color", which you configure in script.
* a shader that changes the sprite's look depending on time will have a constant "current time", which the engine updates each frame (find more info below).
So, suppose you have one shader, but want to configure it differently for different objects. That's where you need separate ShaderInstances.
You create 5 shader instances, assign them to 5 objects, then set different constant values for these separate instances.
Here's an example:
ShaderProgram* myshader;
ShaderInstance* myshaderInsts[5];
function game_start()
{
myshader = ShaderProgram.CreateFromFile("$DATA$/shaders/myshader.glsl");
for (int i = 0; i < 5; i++)
{
myshaderInsts[i] = myshader.CreateInstance();
object[i].Shader = myshaderInsts[i];
}
myshaderInsts[0].SetConstantF3("Color", 1.0, 0.0, 0.0); // red
myshaderInsts[1].SetConstantF3("Color", 0.0, 1.0, 0.0); // green
myshaderInsts[2].SetConstantF3("Color", 0.0, 0.0, 1.0); // blue
// and so on
}
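Constants don't have to be set only once at startup; they may also be updated at runtime, e.g. every game frame. A hedged sketch (the "Strength" constant is hypothetical and must be declared in your own shader):

```ags
// Sketch: animate a hypothetical "Strength" constant each game frame.
float phase;

function repeatedly_execute()
{
    phase += 0.05;
    if (phase > 6.2832) phase -= 6.2832; // wrap after one full cycle
    // oscillate the constant between 0.0 and 1.0
    myshaderInsts[0].SetConstantF("Strength", (Maths.Sin(phase) + 1.0) / 2.0);
}
```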
How to write shaders

Put very simply, a "pixel shader" is an algorithm that runs over each pixel of a sprite, receives the real pixel color as input, and returns the same or another pixel color as output. So what it does, essentially, is change sprite pixel colors in some way. This change is not permanent (the sprite remains intact), but the result of the shader is used to draw this sprite on screen.
Unfortunately, I have neither the expertise nor the spare time to explain shader scripts from the ground up. But AGS uses standard shader languages, and there must be thousands of tutorials online.
There are, however, a few things that I must mention.
When writing shaders you may have your own custom constants, but you are also allowed to use a number of "standard" constants provided by the engine. The engine sets their values automatically for each draw, so you don't have to do that yourself (in fact, if you do, these values will be overridden by the engine).
In GLSL these constants must match by name and type, but their order is not important (they may also be omitted if unused). That's because OpenGL finds them by name.
In HLSL these constants EITHER must match by type and *register number*, OR you have to write an accompanying ini file called "<shadername>.d3ddef", which describes your shader. That's because Direct3D 9 needs a separate utility library for finding constants automatically, but this library is outdated, so I decided not to use it to be safe. I will explain the "d3ddef" file a little further down.
The following is the list of primary standard constants:
* float iTime - current time in seconds; note that it's not the exact value that matters, but the fact that it changes over time;
* int iGameFrame - current game frame index (NOTE: must be float in HLSL, because apparently the D3D9-compatible mode does not support integer constants? at least that's what I read somewhere);
* sampler2D iTexture - sprite's texture reference;
* vec2 iTextureDim - texture's dimensions in pixels (type `float2` in HLSL);
* float iAlpha - sprite's general alpha (object's transparency);
* vec2 iOutputDim - final dimensions in pixels (type `float2` in HLSL); this constant is only set for the "whole screen shader" and tells the real resolution that the image will have when it appears in the window. If the shader is applied to any other game object, this constant will have a value of zero.
These are standard constants used to attach more textures to the shader:
* sampler2D iTexture0 to iTexture3 (NOTE: iTexture0 is an alias to iTexture, and either of these two may be used);
* vec2 iTextureDim0 to iTextureDim3 (NOTE: iTextureDim0 is an alias to iTextureDim).
Other predefined input parameters:
* vec2 vTexCoord - predefined for GLSL only; gets the current texture coordinate for the pixel. HLSL should use the TEXCOORD0 input parameter (see shader examples below).
Example of declaring parameters in GLSL:
uniform float iTime;
uniform int iGameFrame;
uniform sampler2D iTexture;
uniform vec2 iTextureDim;
uniform float iAlpha;
varying vec2 vTexCoord;
Example of declaring parameters in HLSL (notice the order of registers! - that matters if you precompile the shader):
// Pixel shader input structure
struct PS_INPUT
{
float2 Texture : TEXCOORD0;
};
// Pixel shader output structure
struct PS_OUTPUT
{
float4 Color : COLOR0;
};
sampler2D iTexture; // is in sampler register 0
const float iTime: register( c0 );
const float iGameFrame: register( c1 );
const float2 iTextureDim: register( c2 );
const float iAlpha: register( c3 );
Example of my "Colorwave" shader in GLSL (for OpenGL):
Spoiler
uniform sampler2D iTexture;
uniform float iTime;
uniform int iGameFrame;
uniform vec2 iTextureDim;
uniform float iAlpha;
varying vec2 vTexCoord;
#define PI 3.1415
#define PI2 6.283
#define FPS 120.0
#define WAVE_DIR -1.0
#define TINT_STRENGTH 0.2
#define X_OFFSET_STRENGTH 0.00
#define Y_OFFSET_STRENGTH 0.02
void main()
{
vec2 uv = vTexCoord;
// convert from textcoord [-1;1] to [0;1] range
vec2 uv_1 = uv * 0.5 + 0.5;
// timer goes [0 -> 1) and resets, in FPS frequency
float timer = mod(iGameFrame, FPS) / FPS;
// wave cycles by timer + add starting phase depending on texture pixel position
float wave_x = sin((WAVE_DIR * PI2 * timer) + (PI2 * uv_1.x));
float wave_y = sin((WAVE_DIR * PI2 * timer) + (PI2 * uv_1.y));
float wave_z = sin((WAVE_DIR * PI2 * timer) + (PI * uv_1.x));
vec3 tint = vec3(TINT_STRENGTH * wave_x, TINT_STRENGTH * wave_y, TINT_STRENGTH * wave_z);
vec4 color = texture2D(iTexture, uv + vec2(wave_x * X_OFFSET_STRENGTH, wave_y * Y_OFFSET_STRENGTH));
gl_FragColor = vec4(color.xyz + tint, color.w);
}
and same shader in HLSL (for Direct3D):
Spoiler
// Pixel shader input structure
struct PS_INPUT
{
float2 Texture : TEXCOORD0;
};
// Pixel shader output structure
struct PS_OUTPUT
{
float4 Color : COLOR0;
};
// Global variables
sampler2D iTexture;
const float iTime: register( c0 );
const float iGameFrame: register( c1 );
const float2 iTextureDim: register( c2 );
const float iAlpha: register( c3 );
#define PI 3.1415
#define PI2 6.283
#define FPS 120.0
#define WAVE_DIR -1.0
#define TINT_STRENGTH 0.2
#define X_OFFSET_STRENGTH 0.00
#define Y_OFFSET_STRENGTH 0.02
PS_OUTPUT main( in PS_INPUT In )
{
float2 uv = In.Texture;
// convert from textcoord [-1;1] to [0;1] range
float2 uv_1 = uv * 0.5 + 0.5;
// timer goes [0 -> 1) and resets, in FPS frequency
float timer = fmod(iGameFrame, FPS) / FPS;
// wave cycles by timer + add starting phase depending on texture pixel position
float wave_x = sin((WAVE_DIR * PI2 * timer) + (PI2 * uv_1.x));
float wave_y = sin((WAVE_DIR * PI2 * timer) + (PI2 * uv_1.y));
float wave_z = sin((WAVE_DIR * PI2 * timer) + (PI * uv_1.x));
float3 tint = float3(TINT_STRENGTH * wave_x, TINT_STRENGTH * wave_y, TINT_STRENGTH * wave_z);
float4 color = tex2D(iTexture, uv + float2(wave_x * X_OFFSET_STRENGTH, wave_y * Y_OFFSET_STRENGTH));
PS_OUTPUT Out;
Out.Color = float4(color.xyz + tint, color.w);
return Out;
}
Back to the "d3ddef" file required for Direct3D's HLSL shaders. This file is obligatory if you have custom constants, as Direct3D cannot know about these. Another reason to write one is if you'd like to specify the compiler target (HLSL version).
The "d3ddef" file is simply an "ini" file, which may contain a few options:
[compiler]
target = compilation target ("ps_2_0", and so on)
entry = entry function name (e.g. "main")
[constants]
<constant_name> = register index (a number >= 0)
If the "compiler" options are not present, Direct3D will use defaults.
If "constants" are not present, Direct3D will use the default hardcoded register values. Note that if you include a "[constants]" section, then you MUST list ALL constants, including the standard ones.
Here's an example of "colorwave.d3ddef" for my demo shader (I don't actually use it in the demo game, but I could have):
[compiler]
target = ps_2_b
[constants]
iGameFrame = 1
iTextureDim = 2
iAlpha = 3
An update: I've revamped the script API for the purpose of supporting custom shader parameters, and now it looks like this:
builtin managed struct ShaderProgram {
/// Creates a new ShaderProgram by either loading a precompiled shader, or reading source code and compiling one.
import static ShaderProgram* CreateFromFile(const string filename); // $AUTOCOMPLETESTATICONLY$
/// Creates a new shader instance of this shader program.
import ShaderInstance* CreateInstance();
/// Gets the default shader instance of this shader program.
import readonly attribute ShaderInstance* Default;
};
builtin managed struct ShaderInstance {
/// Sets a shader's constant value as 1 float
import void SetConstantF(const string name, float value);
/// Sets a shader's constant value as 2 floats
import void SetConstantF2(const string name, float x, float y);
/// Sets a shader's constant value as 3 floats
import void SetConstantF3(const string name, float x, float y, float z);
/// Sets a shader's constant value as 4 floats
import void SetConstantF4(const string name, float x, float y, float z, float w);
};
Here ShaderProgram represents the compiled shader itself, and ShaderInstance is a shader setup with certain constant values. You may think of ShaderInstance as a kind of limited "material" type.
Game objects (characters, etc.) now have a ShaderInstance pointer assigned to them, rather than a numeric "shader id".
Each ShaderProgram has a "default" instance which is always present, and may be used when either the shader does not have custom parameters, or you don't want to set them up. On the other hand, if you want to use the same shader on multiple objects but with separate sets of parameters, then you can create more ShaderInstances.
A ShaderInstance may be assigned to multiple objects, in which case they will all share the same shader setup.
For example:
player.Shader = myShaderProgram.Default;
cCharacter1.Shader = myShaderProgram.CreateInstance();
cCharacter2.Shader = myShaderProgram.CreateInstance();
cCharacter3.Shader = myShaderProgram.CreateInstance();
cCharacter1.Shader.SetConstantF("CustomConstant", 1.0);
cCharacter2.Shader.SetConstantF("CustomConstant", 2.0);
cCharacter3.Shader.SetConstantF("CustomConstant", 5.0);
More details are in the updated PR post:
https://github.com/adventuregamestudio/ags/pull/2716#issue-2987287896
Download the experimental build here:
https://cirrus-ci.com/task/6359903394070528
Another update: the shaders feature is practically done, at least in its first iteration. Every drawn game object has a Shader property, and so do Camera, Viewport, and Screen, so you may have shader effects on all of them in any combination. Savegames work too; furthermore, it's possible to save the game with one graphics driver and load it with another, and have the shader settings persist (shaders may work differently or not at all, as with the software renderer, but all the shader settings are remembered regardless).
There are a couple of remaining issues, but they will have to be addressed separately.
PR with full explanation and usage instructions:
https://github.com/adventuregamestudio/ags/pull/2716
Download experimental build from CI:
https://cirrus-ci.com/task/5684067657580544
NOTE: the very first post in this forum thread contains outdated information. I will replace it tomorrow when I have more spare time.
The feature has been merged into the AGS 4 branch, and will be part of the next AGS 4 Alpha update.
I've rewritten the first post in this thread, please refer to it for the updated instructions, test build download, demo game, etc.
Hey, I wanted to play with this and had some ideas, but currently the only texture a shader instance can access is the one it is attached to. I think this is due to how AGS works: it works with sprites, which may or may not have a texture associated with them.
I actually meant to try the simplest shader I could think of, which would sample a texture and then use that information to modify the texture of the object the shader instance is attached to - my first idea would be to blatantly replace the pixels of one with the other, and then work from there on other ideas.
Basically object oAaa with the image
(https://i.imgur.com/Ks4AtBo.png)
And object oBbb with the image
(https://i.imgur.com/gLJXGF8.png)
And then I just wanted to use a shader to replace pixels from oBbb with the ones from oAaa - this would be just a test, and then I would use it to explore the possibilities.
Essentially the objects would look like this
(https://i.imgur.com/USE0EkL.png)
And then after using a shader on the object at the right it should look like this
(https://i.imgur.com/piQ5CS6.png)
Quote from: eri0o on Wed 21/05/2025 03:32:06
Hey, I wanted to play with this and had some ideas, but currently the only texture a shader instance can have access to, is the one it is attached to. I think this is due to how AGS works that it works with sprites, which may or may not have a texture associated.
I had this in my future plans. I did not think the implementation through yet.
But the general idea was to assign a sprite number as an extra "texture source" to the ShaderInstance. When this is done, the engine would create and lock a texture for that sprite in the texture cache, and assign it to the DDB as an extra sample texture. Then this extra texture is attached to the shader during render. I might try making an experiment in a few days.
Cool! I think with that I could do the texture copy, then try tiled texture scrolling (this one I can do already; it makes for some cool menu backgrounds), and then try an affine transform. With the overlays I got my mode7 module to be quite fast, except for the ground, which still requires lots of drawing surface manipulation; I wanted to see if I could do something with shaders instead. I think I can also hack this with a single Overlay that has the right texture and is resized to screen size or something, but I am not sure yet.
OK, it is not pretty, but I made my first shader:
uniform float iTime;
uniform int iGameFrame;
uniform sampler2D iTexture;
uniform vec2 iTextureDim;
uniform float iAlpha;
varying vec2 vTexCoord;
void main()
{
// this is both direction and speed, should be lower than 1.
vec2 dir = vec2(0.15, 0.05);
// width and height of the texture
vec2 wh = iTextureDim;
vec2 uv = vTexCoord;
vec2 scroll = mod(uv * wh + dir * iTime * wh, wh);
vec2 uv_1 = vec2 (scroll.x / wh.x , scroll.y / wh.y);
vec4 color = texture2D(iTexture, uv_1);
gl_FragColor = color;
}
The idea here is to have the texture continuously scrolling in some direction.
(https://i.ibb.co/TB9gM4CJ/2025-05-2311-21-48-ezgif-com-optimize.gif) (https://imgbb.com/)
Then you can use two overlays for a message box: one that has the shader as the infinitely scrolling background, and another on top that has the text with a frame and a transparent background. I like this for puzzle game UIs.
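The two-overlay setup could be sketched roughly like this (the sprite slots and the exact Overlay.CreateGraphical signature are assumptions; check your AGS version's API):

```ags
// Sketch: scrolling shader background + static frame/text overlay on top.
Overlay* msgBack;
Overlay* msgFrame;

function ShowScrollingMessageBox()
{
    msgBack = Overlay.CreateGraphical(40, 30, 10);  // sprite 10: tileable background
    msgBack.Shader = scrollShader.Default;          // scrollShader created earlier from the glsl above
    msgFrame = Overlay.CreateGraphical(40, 30, 11); // sprite 11: frame + text, transparent center
    msgFrame.ZOrder = msgBack.ZOrder + 1;           // keep the frame above the background
}
```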
The first expansion of the Shaders feature is support for attaching textures, which allows you to mix multiple input textures in a shader (up to 4).
PR: https://github.com/adventuregamestudio/ags/pull/2733
Download test build here: https://cirrus-ci.com/task/4706533503664128
New script command:
/// Sets a shader's secondary input texture, using a sprite number. Only indexes 1-3 are supported.
void ShaderInstance.SetTexture(int index, int sprite);
Note that you can only use indexes 1..3. Index 0 means the primary texture, so it's not allowed here.
The sprite can be any sprite number, either a regular or a dynamic sprite; there's no difference.
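A hedged usage sketch in script (the filename and sprite number are hypothetical):

```ags
// Sketch: attach sprite 5 as the shader's second texture (index 1).
ShaderProgram* mixShader;

function game_start()
{
    mixShader = ShaderProgram.CreateFromFile("$DATA$/shaders/mix.glsl");
    ShaderInstance* inst = mixShader.CreateInstance();
    inst.SetTexture(1, 5);   // any regular or dynamic sprite number works
    player.Shader = inst;
}
```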
Use in shaders:
In GLSL you must use the hardcoded 'sampler2D' uniforms "iTexture2", "iTexture3" and "iTexture4".
In HLSL the names are not hardcoded per se (although you may use the same ones), but the registers matter: the 'sampler2D' variables have to be assigned to registers "s0", "s1" and so on.
GLSL example:
Spoiler
#version 130
uniform sampler2D iTexture;
uniform sampler2D iTexture2;
uniform vec2 iTextureDim;
varying vec2 vTexCoord;
void main()
{
vec2 uv = vTexCoord.xy;
if (floor(mod(uv.x * iTextureDim.x, 2)) == 0 && floor(mod(uv.y * iTextureDim.y, 2)) == 0)
gl_FragColor = texture2D(iTexture, uv);
else
gl_FragColor = texture2D(iTexture2, uv);
}
HLSL example:
Spoiler
// Pixel shader input structure
struct PS_INPUT
{
float2 Texture : TEXCOORD0;
};
// Pixel shader output structure
struct PS_OUTPUT
{
float4 Color : COLOR0;
};
sampler2D iTexture : register( s0 );
sampler2D iTexture2 : register( s1 );
const float2 iTextureDim: register( c2 );
PS_OUTPUT main( in PS_INPUT In )
{
PS_OUTPUT Out; //create an output pixel
float2 uv = In.Texture;
float4 outpixel;
if (floor(fmod(uv.x * iTextureDim.x, 2)) == 0 && floor(fmod(uv.y * iTextureDim.y, 2)) == 0)
outpixel = tex2D(iTexture, uv);
else
outpixel = tex2D(iTexture2, uv);
Out.Color = outpixel;
return Out; //return output pixel
}
The examples above simply interleave 2 sprites, where odd pixels come from one sprite and even pixels come from the other.
Quote from: Crimson Wizard on Wed 21/05/2025 04:27:24
Hey, I wanted to play with this and had some ideas, but currently the only texture a shader instance can have access to, is the one it is attached to.
Hey CW, this is looking so amazing! Lately I've been experimenting with some shaders along with some tutorials, and I might be facing an issue. If I attach a shader (e.g. to Screen.Shader), then I can't assign another shader to it? I am setting the shader to null before assigning a new one, but it doesn't seem to be working; I might be doing something wrong. Here I tried experimenting with assigning different shaders to Screen, Camera, Room etc., and it's all working well, also with SetTexture (I use it for the CRT shader): https://imgur.com/S8CfpZN (https://imgur.com/S8CfpZN) Also, for some reason, if I run the game from the exe I don't see any shaders at all, but it's working fine in the editor.
Any chance the exe and running from the Editor somehow use different graphics drivers? I think Ctrl+V can show the graphics driver at runtime, and it also shows up in the log. A glsl shader only works with the OpenGL driver, hlsl has to be prebuilt and will only work with the Direct3D 9 driver, and the software driver doesn't support shaders at all.
Quote from: Vincent on Tue 27/05/2025 00:40:22
If I attach a shader (eg. to Screen.Shader) then I can't assign another shader to it? I am passing the shader to null before assign it a new one but it doesn't seems to be working, i might be doing something wrong.
Maybe there's some mistake in the engine, I will double-check that.
Quote from: Vincent on Tue 27/05/2025 00:40:22
Also for some reason if i run the game from the exe i don't see any shaders at all but its working fine in the editor.
There may be 2 reasons:
- you forgot to do Build EXE, and the exe in the Compiled folder remains an older version without shaders.
- somehow the exe uses a different config with another gfx driver set (that's unusual, but may happen in theory).
Quote from: Vincent on Tue 27/05/2025 00:40:22
If I attach a shader (eg. to Screen.Shader) then I can't assign another shader to it? I am passing the shader to null before assign it a new one but it doesn't seems to be working, i might be doing something wrong.
I tried this, and it works correctly when I switch Screen.Shader to another shader instance. Please post the code that you are using.
I wrote the shaders only in HLSL in Visual Studio, and the graphics driver is Direct3D 9 both in the editor and at runtime. My forgetfulness: it was enough to prebuild them to make them work fine. I might also convert them to GLSL just in case.
Quote from: Crimson Wizard on Tue 27/05/2025 10:31:49
I tried this, and it works correctly when I switch Screen.Shader to another shader instance. Please post the code that you are using.
My bad, it was just a trivial mistake in the code; it's indeed all working well. I'm pretty excited about the shader features :) you guys are doing beautiful work.
I'd like to encourage users to upload their test games with shaders and post links here, preferably with the shader scripts. This may serve as a demonstration for those who are curious, and for those who do not know shaders well.
For reference, shader scripts do not have to be packed inside the game, of course; they may also be placed in the game's folder (or a subfolder) and loaded from there at runtime. This makes them more visible (better serving the demonstration purpose).
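In that case the call would use a plain relative path instead of the "$DATA$" package token (a sketch; the path is hypothetical):

```ags
// Sketch: load a shader from a subfolder of the game's folder at runtime,
// rather than from the game package ($DATA$).
ShaderProgram* sp = ShaderProgram.CreateFromFile("shaders/myshader.glsl");
```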
Quote from: Crimson Wizard on Tue 27/05/2025 15:32:57
For the reference, shader scripts do not have to be packed inside the game of course, they may also be placed just in the game's folder (or subfolder) and loaded from there at runtime. This makes them more visible (better serving demonstration purpose).
Right, I only had to add the shaders to the "Package custom data folders" option in General Settings. I made these shaders following some tutorials, though they need some editing to suit SetConstantF so they can be changed via script. I've been trying to convert the defines into shader constants, and even though I don't get any errors, nothing seems to change. I did a quick upload, but I might do a better one in the future with a full demonstration: https://www.fileconvoy.com/dfl.php?id=gb4565f24b363a49010005925505cf67d38eb685b35 (https://www.fileconvoy.com/dfl.php?id=gb4565f24b363a49010005925505cf67d38eb685b35)
I tried to add a function like this, but it didn't work:
function update_pixel_shader(float alpha, float scanlineIntensity, float useXBR, float useScanlines, float useSubpixelAA, float scaleFactor)
{
pixelInstance.SetConstantF2("iTextureDim", textureWidth, textureHeight); // c2
pixelInstance.SetConstantF("iAlpha", alpha); // c3
pixelInstance.SetConstantF("iScanlineIntensity", scanlineIntensity); // c4
pixelInstance.SetConstantF("USE_XBR", useXBR); // c5
pixelInstance.SetConstantF("USE_CRT_SCANLINES", useScanlines); // c6
pixelInstance.SetConstantF("USE_SUBPIXEL_AA", useSubpixelAA); // c7
pixelInstance.SetConstantF("SCALE_FACTOR", scaleFactor); // c8
}
// room_Load():
update_pixel_shader(
0.8, // alpha
0.2, // scanlineIntensity
1.0, // useXBR (enabled)
1.0, // useScanlines (enabled)
0.0, // useSubpixelAA (disabled)
4.0 // scaleFactor
);
I don't know why this in particular wouldn't work, but I would like to note that iTextureDim is already a built-in constant, which AGS fills with the dimensions in pixels (as floats) of the texture of the thing you are attaching the shader to.
Quote from: Vincent on Wed 28/05/2025 00:40:21
I tried to had a function like this but it didnt work:
Please post the shader code itself. I cannot tell what the link is pointing to, because unfortunately it returns an error for me.
Shader compilation results are written to the engine logs. These may be seen in the Log Panel when you run from the Editor (you have to set the Main message group output to Debug in order to see them).
Built-in shader constants, such as "iTextureDim" and "iAlpha", are set by the renderer automatically, based on the sprite/texture parameters or game state. You should not set these yourself, as their values will be overwritten anyway.
Is the link broken? Because I shared the whole project with all the shaders.
By the way, this is one of the shaders whose constants I was trying to set. This is still the unedited version, but the other parameters were not changing either:
Spoiler
// Pixel shader input structure
struct PS_INPUT
{
float2 Texture : TEXCOORD0;
};
// Pixel shader output structure
struct PS_OUTPUT
{
float4 Color : COLOR0;
};
// Configuration (set these to 1 to enable, 0 to disable)
#define USE_XBR 1 // Advanced edge detection (recommended)
#define USE_CRT_SCANLINES 1 // CRT-style scanlines
#define USE_SUBPIXEL_AA 0 // Subpixel anti-aliasing
#define SCALE_FACTOR 4.0 // Must match your render target upscale
// Global variables
sampler2D iTexture;
const float2 iTextureDim : register(c2);
const float iAlpha : register(c3);
// XBR (eXperimental Batch Rendering) edge detection
float4 getXBRColor(float2 uv)
{
float2 texel = 1.0 / iTextureDim;
// Sample 3x3 grid
float4 c11 = tex2D(iTexture, uv); // Center
float4 c00 = tex2D(iTexture, uv + texel * float2(-1, -1));
float4 c20 = tex2D(iTexture, uv + texel * float2(1, -1));
float4 c02 = tex2D(iTexture, uv + texel * float2(-1, 1));
float4 c22 = tex2D(iTexture, uv + texel * float2(1, 1));
// Calculate edge weights
float d_edge = (dot(abs(c00 - c22), 1) + dot(abs(c20 - c02), 1)) * 0.25;
float h_edge = (c20.r + c20.g + c20.b - c00.r - c00.g - c00.b) * 0.5;
float v_edge = (c02.r + c02.g + c02.b - c00.r - c00.g - c00.b) * 0.5;
// Blend based on edges
float blend_factor = smoothstep(0.0, 0.5, d_edge);
float4 result = c11;
if (abs(h_edge) > abs(v_edge))
{
result = lerp(result, (c20 + c00) * 0.5, blend_factor);
}
else
{
result = lerp(result, (c02 + c00) * 0.5, blend_factor);
}
return result;
}
// CRT scanline effect
float3 applyScanlines(float2 uv, float3 color)
{
float scanline = sin(uv.y * iTextureDim.y * 3.14159 * 2.0);
return color * (0.9 + 0.1 * scanline * scanline);
}
// Subpixel anti-aliasing
float3 applySubpixelAA(float2 uv, float3 color)
{
float2 texel = 1.0 / iTextureDim;
float2 subCoord = frac(uv * iTextureDim);
float4 c = tex2D(iTexture, uv);
float4 r = tex2D(iTexture, uv + float2(texel.x, 0));
float4 l = tex2D(iTexture, uv - float2(texel.x, 0));
float4 u = tex2D(iTexture, uv + float2(0, texel.y));
float4 d = tex2D(iTexture, uv - float2(0, texel.y));
// Weighted blend based on subpixel position
float3 result = color;
result = lerp(result, 0.5 * (c.rgb + r.rgb), smoothstep(0.3, 0.7, subCoord.x));
result = lerp(result, 0.5 * (c.rgb + l.rgb), smoothstep(0.7, 0.3, subCoord.x));
result = lerp(result, 0.5 * (c.rgb + u.rgb), smoothstep(0.3, 0.7, subCoord.y));
result = lerp(result, 0.5 * (c.rgb + d.rgb), smoothstep(0.7, 0.3, subCoord.y));
return result;
}
PS_OUTPUT main(in PS_INPUT In)
{
float2 uv = In.Texture;
float4 color = tex2D(iTexture, uv);
// Advanced upscaling techniques
#if USE_XBR
color = getXBRColor(uv);
#endif
#if USE_SUBPIXEL_AA
color.rgb = applySubpixelAA(uv, color.rgb);
#endif
#if USE_CRT_SCANLINES
color.rgb = applyScanlines(uv, color.rgb);
#endif
// Output with alpha
PS_OUTPUT Out;
Out.Color = float4(color.rgb, color.a * iAlpha);
return Out;
}
So, this is an HLSL shader, and it needs a "<shadername>.d3ddef" file if you are using custom constants. This is explained in the first post.
Example of d3ddef file for your shader:
[constants]
iTextureDim = 2
iAlpha = 3
iOutputDim = 4
iScanlineIntensity = 5
USE_XBR = 6
USE_CRT_SCANLINES = 7
USE_SUBPIXEL_AA = 8
Then, you do not need a SCALE_FACTOR for the full-screen shader; instead use the "iOutputDim" built-in constant - it tells the final resolution of the fullscreen texture.
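For reference, each name under [constants] binds to the HLSL constant register with that index. A matching set of declarations for the example d3ddef above might look like this (a sketch only, assuming the same register convention as the shader posted later in this thread, where iTexture is sampler s0 and the engine-provided constants start at c2):

```hlsl
// Engine-provided inputs
sampler2D iTexture : register(s0);
const float2 iTextureDim : register(c2);     // matches "iTextureDim = 2"
const float iAlpha : register(c3);           // matches "iAlpha = 3"
const float2 iOutputDim : register(c4);      // matches "iOutputDim = 4"
// Custom constants, set from script via ShaderInstance.SetConstantF
const float iScanlineIntensity : register(c5); // matches "iScanlineIntensity = 5"
const float USE_XBR : register(c6);
const float USE_CRT_SCANLINES : register(c7);
const float USE_SUBPIXEL_AA : register(c8);
```

The point is simply that the number in the d3ddef and the cN index in the HLSL declaration must agree, otherwise the engine writes the constant into the wrong slot.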
OK, I added the .d3ddef file in the shader folder and edited the GLSL and HLSL:
hlsl:
Spoiler
// Pixel shader input structure
struct PS_INPUT
{
float2 Texture : TEXCOORD0;
};
// Pixel shader output structure
struct PS_OUTPUT
{
float4 Color : COLOR0;
};
// Configuration uniforms
const float USE_XBR : register(c4);
const float USE_CRT_SCANLINES : register(c5);
const float USE_SUBPIXEL_AA : register(c6);
const float2 iOutputDim : register(c7);
// Engine-provided uniforms
sampler2D iTexture : register(s0);
const float2 iTextureDim : register(c2);
const float iAlpha : register(c3);
// xBR-style ("scale by rules") edge detection
float4 getXBRColor(float2 uv)
{
float2 texel = 1.0 / iTextureDim;
// Sample 3x3 grid
float4 c11 = tex2D(iTexture, uv); // Center
float4 c00 = tex2D(iTexture, uv + texel * float2(-1, -1));
float4 c20 = tex2D(iTexture, uv + texel * float2(1, -1));
float4 c02 = tex2D(iTexture, uv + texel * float2(-1, 1));
float4 c22 = tex2D(iTexture, uv + texel * float2(1, 1));
// Calculate edge weights
float d_edge = (dot(abs(c00 - c22), 1) + dot(abs(c20 - c02), 1)) * 0.25;
float h_edge = (c20.r + c20.g + c20.b - c00.r - c00.g - c00.b) * 0.5;
float v_edge = (c02.r + c02.g + c02.b - c00.r - c00.g - c00.b) * 0.5;
// Blend based on edges
float blend_factor = smoothstep(0.0, 0.5, d_edge);
float4 result = c11;
if (abs(h_edge) > abs(v_edge))
{
result = lerp(result, (c20 + c00) * 0.5, blend_factor);
}
else
{
result = lerp(result, (c02 + c00) * 0.5, blend_factor);
}
return result;
}
// CRT scanline effect (now resolution-aware)
float3 applyScanlines(float2 uv, float3 color)
{
// Base scanline density on output resolution
float scanlineDensity = iOutputDim.y / 600.0; // 600 = reference resolution
float scanline = sin(uv.y * scanlineDensity * 3.14159 * 2.0);
return color * (0.9 + 0.1 * scanline * scanline);
}
// Subpixel anti-aliasing (resolution-aware)
float3 applySubpixelAA(float2 uv, float3 color)
{
// Use output dimensions for AA scaling
float2 texel = 1.0 / iOutputDim;
float2 subCoord = frac(uv * iOutputDim);
float4 c = tex2D(iTexture, uv);
float4 r = tex2D(iTexture, uv + float2(texel.x, 0));
float4 l = tex2D(iTexture, uv - float2(texel.x, 0));
float4 u = tex2D(iTexture, uv + float2(0, texel.y));
float4 d = tex2D(iTexture, uv - float2(0, texel.y));
// Weighted blend based on subpixel position
float3 result = color;
result = lerp(result, 0.5 * (c.rgb + r.rgb), smoothstep(0.3, 0.7, subCoord.x));
result = lerp(result, 0.5 * (c.rgb + l.rgb), smoothstep(0.7, 0.3, subCoord.x));
result = lerp(result, 0.5 * (c.rgb + u.rgb), smoothstep(0.3, 0.7, subCoord.y));
result = lerp(result, 0.5 * (c.rgb + d.rgb), smoothstep(0.7, 0.3, subCoord.y));
return result;
}
PS_OUTPUT main(in PS_INPUT In)
{
float2 uv = In.Texture;
float4 color = tex2D(iTexture, uv);
// Advanced upscaling techniques
if (USE_XBR > 0.5)
{
color = getXBRColor(uv);
}
if (USE_SUBPIXEL_AA > 0.5)
{
color.rgb = applySubpixelAA(uv, color.rgb);
}
if (USE_CRT_SCANLINES > 0.5)
{
color.rgb = applyScanlines(uv, color.rgb);
}
// Output with alpha
PS_OUTPUT Out;
Out.Color = float4(color.rgb, color.a * iAlpha);
return Out;
}
glsl:
Spoiler
// Configuration uniforms (controlled from AGS)
uniform float USE_XBR;
uniform float USE_CRT_SCANLINES;
uniform float USE_SUBPIXEL_AA;
uniform vec2 iOutputDim;
// Standard engine-provided uniforms
uniform float iTime;
uniform int iGameFrame;
uniform sampler2D iTexture;
uniform vec2 iTextureDim;
uniform float iAlpha;
varying vec2 vTexCoord;
// xBR-style ("scale by rules") edge detection
vec4 getXBRColor(vec2 uv)
{
vec2 texel = 1.0 / iTextureDim;
// Sample 3x3 grid
vec4 c11 = texture2D(iTexture, uv); // Center
vec4 c00 = texture2D(iTexture, uv + texel * vec2(-1, -1));
vec4 c20 = texture2D(iTexture, uv + texel * vec2(1, -1));
vec4 c02 = texture2D(iTexture, uv + texel * vec2(-1, 1));
vec4 c22 = texture2D(iTexture, uv + texel * vec2(1, 1));
// Calculate edge weights
float d_edge = (dot(abs(c00 - c22), vec4(1)) + dot(abs(c20 - c02), vec4(1))) * 0.25;
float h_edge = (c20.r + c20.g + c20.b - c00.r - c00.g - c00.b) * 0.5;
float v_edge = (c02.r + c02.g + c02.b - c00.r - c00.g - c00.b) * 0.5;
// Blend based on edges
float blend_factor = smoothstep(0.0, 0.5, d_edge);
vec4 result = c11;
if (abs(h_edge) > abs(v_edge))
{
result = mix(result, (c20 + c00) * 0.5, blend_factor);
}
else
{
result = mix(result, (c02 + c00) * 0.5, blend_factor);
}
return result;
}
// CRT scanline effect
vec3 applyScanlines(vec2 uv, vec3 color)
{
float scanline = sin(uv.y * iTextureDim.y * 3.14159 * 2.0);
return color * (0.9 + 0.1 * scanline * scanline);
}
// Subpixel anti-aliasing
vec3 applySubpixelAA(vec2 uv, vec3 color)
{
vec2 texel = 1.0 / iTextureDim;
vec2 subCoord = fract(uv * iTextureDim);
vec4 c = texture2D(iTexture, uv);
vec4 r = texture2D(iTexture, uv + vec2(texel.x, 0));
vec4 l = texture2D(iTexture, uv - vec2(texel.x, 0));
vec4 u = texture2D(iTexture, uv + vec2(0, texel.y));
vec4 d = texture2D(iTexture, uv - vec2(0, texel.y));
// Weighted blend based on subpixel position
vec3 result = color;
result = mix(result, 0.5 * (c.rgb + r.rgb), smoothstep(0.3, 0.7, subCoord.x));
result = mix(result, 0.5 * (c.rgb + l.rgb), smoothstep(0.7, 0.3, subCoord.x));
result = mix(result, 0.5 * (c.rgb + u.rgb), smoothstep(0.3, 0.7, subCoord.y));
result = mix(result, 0.5 * (c.rgb + d.rgb), smoothstep(0.7, 0.3, subCoord.y));
return result;
}
void main()
{
vec2 uv = vTexCoord;
vec4 color = texture2D(iTexture, uv);
// Calculate scaling based on output dimensions
vec2 pixelSize = 1.0/iOutputDim;
// Advanced upscaling techniques
if (USE_XBR > 0.5) {
color = getXBRColor(uv);
}
if (USE_SUBPIXEL_AA > 0.5) {
color.rgb = applySubpixelAA(uv, color.rgb);
}
if (USE_CRT_SCANLINES > 0.5) {
// Make scanlines scale with output resolution
float scanlineDensity = iOutputDim.y / 600.0; // Base on 600p
color.rgb = applyScanlines(uv*scanlineDensity, color.rgb);
}
gl_FragColor = vec4(color.rgb, color.a * iAlpha);
}
d3ddef:
Spoiler
[compiler]
target = ps_2_b
[constants]
iGameFrame = 1
iTextureDim = 2
iAlpha = 3
USE_XBR = 4
USE_CRT_SCANLINES = 5
USE_SUBPIXEL_AA = 6
iOutputDim = 7
room script:
Spoiler
function SetPixelSettings(float useXBR, float useScanlines, float useSubpixelAA)
{
// Pass the function arguments through instead of hardcoding values
pixelInstance.SetConstantF("USE_XBR", useXBR);
pixelInstance.SetConstantF("USE_CRT_SCANLINES", useScanlines);
pixelInstance.SetConstantF("USE_SUBPIXEL_AA", useSubpixelAA);
}
function initialize_shaders()
{
PixelShader = ShaderProgram.CreateFromFile("$DATA$/shaders/PixelShader.glsl");
pixelInstance = PixelShader.CreateInstance();
SetPixelSettings(1., 1., 1.);
}
function room_RepExec()
{
if (pixelactive) Room.BackgroundShader = pixelInstance;
else Room.BackgroundShader = null;
}
It was just a quick test, but it's all working well now, thanks. I might come up with a proper demo sooner or later. But in the meantime I hope you got the demo I shared.
Updated the first post with the new download link, now this version supports up to 3 secondary textures attached to a shader. Instructions are also updated in the first post.
For a quicker reference, the update contents are explained in this comment:
https://www.adventuregamestudio.co.uk/forums/engine-development/experiment-ags-4-custom-shaders-support/msg636683542/#msg636683542
I have a small request: can someone download the demo game (linked in this thread's first post), enable the "background" shader (the "Bg" button), and tell me how it looks?
You should be seeing a "ripple" effect, but does it look like smooth waves, or plain circles expanding from the center?
It worked well for me in the past, but when I test this now, I get a different result. I am in doubt whether there's something wrong with the game which I did not notice earlier, or something wrong with my video card (I have suspected lately that it might be malfunctioning).
EDIT: okay, got this fixed by rebooting PC. No idea what caused this, either a random glitch or some other software put gfx card into a wrong state... :/
Anyway, the shaders feature is now part of AGS 4 since the Alpha 22 update:
https://www.adventuregamestudio.co.uk/forums/index.php?msg=636683993
I've had issues with shaders persisting beyond when they should while debugging and halting the AGS engine in Visual Studio, ages ago when I played with the OpenGL ES code for the shaders used in tinting and such. I could usually solve it by restarting. I don't know how to explain it, but the game would boot with ghost tints, shaped like sprites, and it only happened on my PC with Nvidia. My video card was new at the time.
I don't have a target use for this now, but if there was a way to load a shader from a string, while somehow accounting for the different graphics drivers, it would be possible to have shaders in script modules without packaging them separately.
Edit: nevermind, just remembered that DirectX exists.
Edit 2: I guess a possible progression of this is adding a text editor for GLSL and HLSL right in the editor, under a new project node named "Shaders".
Quote from: eri0o on Tue 01/07/2025 16:35:47
I don't have a target use for this now, but if there was a way to load a shader from a string, while somehow accounting for the different graphics drivers, it would be possible to have shaders in script modules without packaging them separately.
I don't know yet, but as a workaround you may write files from script, and then create shaders from them.
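A minimal sketch of that workaround, in AGS script. This assumes the File API and the $SAVEGAMEDIR$ location work as in AGS 3.x/4, and uses ShaderProgram.CreateFromFile exactly as in the room script earlier in this thread; the file name and the trivial GLSL source are hypothetical. Only the GLSL variant is shown here; the Direct3D driver would need an HLSL file written the same way:

```
// Write GLSL source kept in the script to a file, then create the shader from it
File *f = File.Open("$SAVEGAMEDIR$/tempshader.glsl", eFileWrite);
if (f != null)
{
  // Each WriteRawLine emits one line of the shader source
  f.WriteRawLine("uniform sampler2D iTexture;");
  f.WriteRawLine("varying vec2 vTexCoord;");
  f.WriteRawLine("void main() { gl_FragColor = texture2D(iTexture, vTexCoord); }");
  f.Close();
  ShaderProgram *shader = ShaderProgram.CreateFromFile("$SAVEGAMEDIR$/tempshader.glsl");
}
```

A script module could carry its shader source as String literals this way and materialize the file on first use, at the cost of writing to the save directory.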
That is a good alternative for script modules that I hadn't thought about. It may well work.