Game Development 101: (incomplete) a. Basics of Game Development b. Designing your Video Game c. Math in Video Games d. AI Decision-making & Pathfinding e. Asset Creation & Game Art f. Physics & Collision Detection g. Shaders & VFX in Games h. Multiplayer & Networking in Games |
Visual Effects (VFX)
VFX are the effects that enhance the visual experience of a game: particle systems, dynamic lighting, explosions, fire, smoke, weather effects, and more. Shaders play a key role in creating these effects, but not every visual effect is achieved through shaders alone; many involve complex simulations, animations, and interactions.
Shaders
Shaders are programs that run on the GPU (rather than the CPU) and compute graphics in real time. They define how light interacts with surfaces, how colors are displayed, and visual effects such as reflections, shadows, and more.
Shaders are mathematical in nature: we have vectors representing lights, pixel colors, geometry and so on, and we use math functions (dot product, cross product, interpolation…) to compute the final color of each pixel on screen.
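As a small taste of this kind of vector math, here is a CPU-side Python sketch of the diffuse (Lambertian) lighting term, which is just a clamped dot product between the surface normal and the light direction. The function names and vectors are illustrative, not from any particular engine.

```python
def dot(a, b):
    # Dot product of two vectors of equal length.
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, light_dir):
    # 1.0 when the light hits the surface head-on, 0.0 when it comes from behind.
    return max(0.0, dot(normal, light_dir))

print(lambert((0.0, 1.0, 0.0), (0.0, 1.0, 0.0)))   # 1.0 (light straight on)
print(lambert((0.0, 1.0, 0.0), (0.0, -1.0, 0.0)))  # 0.0 (light from behind)
```

A real fragment shader computes exactly this kind of expression, once per pixel, using built-in vector types.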
// An example fragment shader that colors every pixel red
void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
The above program colors every pixel red, even though we wrote just one line of code specifying a color for a single pixel. The reason is that on the GPU, every pixel runs this program separately and in parallel, and no two pixels can share any information, because GPUs are designed for parallel execution.
The Book of Shaders is worth a read.
CPU vs GPU Compute
If the screen has a resolution of 1024×1024, a single frame requires 1,048,576 pixels to be computed. Each pixel needs many operations to look right (sometimes tens or hundreds of calculations per pixel), and at 60 FPS the entire screen must be recomputed 60 times per second. A CPU cannot process this much data in time, because CPUs are designed to be largely sequential: they process instructions one after another (non-parallel compute).
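The back-of-the-envelope arithmetic behind that claim can be checked directly. The 100 operations per pixel below is an illustrative assumption, not a measured number.

```python
width, height = 1024, 1024
pixels_per_frame = width * height   # pixels computed each frame
ops_per_pixel = 100                 # illustrative assumption
fps = 60
ops_per_second = pixels_per_frame * ops_per_pixel * fps

print(pixels_per_frame)  # 1048576
print(ops_per_second)    # 6291456000 (about 6.3 billion operations per second)
```

Even at a modest per-pixel cost, the workload lands in the billions of operations per second, which is why running it serially on the CPU is a non-starter.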
To solve this, GPUs were developed to be extremely parallel: all pixels are computed at the same time on separate GPU threads, or cores. This solves the speed problem, but the trade-off is that we cannot share data between different pixels, since each pixel runs its own instance of the shader program.
Writing Shaders
Shadertoy has some good examples of shaders for inspiration. We will use the Godot Engine to test our shaders, since it provides a simplified, GLSL-like shading language.
Open Godot, create a project, and press Ctrl+A to add a node. Search for the “ColorRect” node in the popup and add it to the scene. Then, in the Inspector (right-side panel), assign a ShaderMaterial to it and add a Shader to that material; now we can get started writing our first shader.
Our First Shader
For our first shader, we will display an image and then apply an animated filter effect to it.
shader_type canvas_item;

uniform sampler2D image;

void vertex() {
    // Called for every vertex the material is visible on.
}

void fragment() {
    // Called for every pixel the material is visible on.
    COLOR = texture(image, UV).rgba;
}
Shader Uniforms
The variable ‘image’ is a uniform, which means its data comes from the CPU. In shaders, uniforms are the way the CPU communicates with the GPU: they are set from CPU-side code and stay constant for every pixel during a draw. Game object data is transferred to the GPU via uniforms before the GPU can compute anything.
In our shader, we declared the image uniform, but no value is assigned to it. That is because the value is assigned from our CPU-side code. In Godot, an easy way to assign it is through the right-side panel: expand the “Shader Parameters” section and assign any image there. The alternative is to assign it from code:
my_shader_material.set_shader_parameter("image", load("res://path/to/img"))
UV Coordinates
After you have assigned the image, you will see it immediately. This shader samples the image we passed to the GPU: each pixel’s color is set to the uniform image’s color at the current pixel’s UV coordinate. UV is a vec2 holding the position of the current pixel, with both components in the range [0.0, 1.0]. In our shader above, a UV of vec2(0.5, 0.5) is the center of the ColorRect; if we sampled with SCREEN_UV instead, vec2(0.5, 0.5) would be the center of the screen.
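The mapping from pixel coordinates to UV coordinates can be sketched in plain code. This assumes the simple convention uv = pixel / resolution; engines differ on half-pixel offsets and Y direction, so treat it as illustrative.

```python
def pixel_to_uv(px, py, width, height):
    # Normalize integer pixel coordinates into the [0.0, 1.0] UV range.
    return (px / width, py / height)

print(pixel_to_uv(512, 512, 1024, 1024))  # (0.5, 0.5) -- the center
print(pixel_to_uv(0, 0, 1024, 1024))      # (0.0, 0.0) -- one corner
```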
Applying Chromatic Aberration Effect to Image
Instead of sampling the full texture at UV, we sample each RGB component individually, with a slight offset per component. This displaces the color channels from their original positions, giving a nice lo-fi effect.
shader_type canvas_item;

uniform sampler2D image;

void vertex() {
    // Called for every vertex the material is visible on.
}

void fragment() {
    // Called for every pixel the material is visible on.
    COLOR.r = texture(image, UV + vec2(0.0, 0.01)).r;
    COLOR.g = texture(image, UV + vec2(0.01, 0.01)).g;
    COLOR.b = texture(image, UV + vec2(0.01, 0.0)).b;
}
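The same idea can be sketched on the CPU to see exactly what the shader does: each channel is read from a slightly shifted position in the source image. Here images are lists of rows of (r, g, b) tuples, and sampling clamps at the borders; all of this is an illustrative approximation of the GPU version above.

```python
def sample(img, x, y):
    # Read a pixel, clamping coordinates so border offsets stay in range.
    h, w = len(img), len(img[0])
    return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

def aberrate(img, offset=1):
    # Shift each channel the same way the shader does:
    # red down, green diagonally, blue right.
    h, w = len(img), len(img[0])
    return [
        [(sample(img, x, y + offset)[0],
          sample(img, x + offset, y + offset)[1],
          sample(img, x + offset, y)[2])
         for x in range(w)]
        for y in range(h)
    ]

# On a uniformly colored image, shifting channels changes nothing:
flat = [[(10, 20, 30)] * 4 for _ in range(4)]
print(aberrate(flat) == flat)  # True
```

On a real image, edges between differently colored regions pick up red/green/blue fringes, which is the chromatic aberration look.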
3D & Vertex Shaders
A vertex shader is used to deform the geometry itself rather than just coloring pixels. Just as we write a pixel’s color in the fragment shader, we write the result of our calculation to the vertex’s position.
For example, an ocean shader can displace its surface vertices using the Gerstner wave function.
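As a sketch of the underlying math, here is a CPU-side Python approximation of the displacement a Gerstner-wave vertex shader would apply to each vertex of a flat plane. The amplitude, wavelength, and speed values are illustrative assumptions.

```python
import math

def gerstner(x, z, t, amplitude=0.5, wavelength=8.0, speed=1.0, direction=(1.0, 0.0)):
    # Displace a flat vertex (x, 0, z) with a single Gerstner wave at time t.
    k = 2.0 * math.pi / wavelength           # wavenumber
    dx, dz = direction
    length = math.hypot(dx, dz)
    dx, dz = dx / length, dz / length        # normalize the wave direction
    phase = k * (dx * x + dz * z) - speed * k * t
    # Vertices also move horizontally toward the crests, which is what
    # distinguishes Gerstner waves from plain sine waves.
    return (x + amplitude * dx * math.cos(phase),
            amplitude * math.sin(phase),
            z + amplitude * dz * math.cos(phase))

print(gerstner(0.0, 0.0, 0.0))  # (0.5, 0.0, 0.0)
```

In the actual shader, the same expression runs once per vertex inside vertex(), writing the result to VERTEX, and TIME replaces the t parameter to animate the waves.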
More on Shaders
Some good sources to learn about shaders:
- The Book of Shaders (online website, very good)
- Shadertoy.com (A good collection of shaders)
- godotshaders.com (A very elegant collection of shaders)
- Godot’s official “Your First Shader” tutorial