
The Godot Shaders Bible.

Everything you need to know about shaders to enhance your game’s visuals.

Fabrizio Espíndola

#Godot #Shaders #GameDev

First Edition 2025

The Godot Shaders Bible, version 0.0.6.


Published by Jettelly Inc. ® All rights reserved. jettelly.com
77 Bloor St. West, Suite #663, Toronto ON M5S 1M2, Canada.
Credits.
Author.
Fabrizio Espíndola

Design.
Pablo Yeber

Head Editor.
Ewan Lee
Contents.

Preface. 6
About the Author. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Errata. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Piracy. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Chapter 1: Introduction to Mesh Composition. 8


1.1 Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.2 Mesh Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.3 Introduction to Spaces. . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
1.4 Configuring a Unique Object. . . . . . . . . . . . . . . . . . . . . . 25
1.5 Rendering Stages and Pipeline. . . . . . . . . . . . . . . . . . . . . 32
1.6 Working with World Space Coordinates. . . . . . . . . . . . . 38
1.7 Introduction to Tangent Space. . . . . . . . . . . . . . . . . . . . . 48
1.8 Implementing Tangent Space in Our Shader. . . . . . . . 51
1.9 Linear Interpolation Between Colors. . . . . . . . . . . . . . . . 64
1.10 Introduction to Matrices. . . . . . . . . . . . . . . . . . . . . . . . . . . 71
1.11 Built-in Matrices. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
1.12 Implementing Custom Matrices. . . . . . . . . . . . . . . . . . . 83

Chapter 2: Lighting and Rendering. 92


2.1 Material Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
2.2 Introduction to the Lambertian Model. . . . . . . . . . . . . . 96
2.3 Implementing the Lambertian Model. . . . . . . . . . . . . . . 99
2.4 Vector: Points vs. Directions. . . . . . . . . . . . . . . . . . . . . . . 111
2.5 Introduction to the Blinn-Phong Model. . . . . . . . . . . . . . 112
2.6 Implementing the Blinn-Phong Model. . . . . . . . . . . . . . 115
2.7 From Linear Lighting to sRGB. . . . . . . . . . . . . . . . . . . . . . . 122
2.8 Introduction to the Rim (Fresnel) Effect. . . . . . . . . . . . . 127
2.9 Implementing the Rim Effect. . . . . . . . . . . . . . . . . . . . . . . 130
2.10 Introduction to the Anisotropic Effect. . . . . . . . . . . . . . . 138
2.11 Implementing the Ashikhmin-Shirley Model. . . . . . . . . 146

Chapter 3: Procedural Shapes and Vertex Animation. 163


3.1 Drawing with Functions and Inequalities. . . . . . . . . . . . 164
3.2 Variables for Animation and Procedural Modification. . 177
3.3 Implementation of the Procedural Character in GDSL. 182

Special thanks. 206


Preface.

About the Author.

Fabrizio Espíndola is a renowned Technical Artist and developer from Santiago, Chile, with
extensive experience in Computer Graphics and a deep passion for creating shaders in
environments like Unity and, more recently, Godot.

He is the author of “The Unity Shaders Bible” and the books “Visualizing Equations Vol. 1 &
2,” essential resources for developers and Technical Artists who want to master the art of
procedural shapes. His practical and detailed approach has helped over 15,000 professionals
worldwide enhance their skills and understanding in developing special effects for video
games.

With more than 17,000 followers on LinkedIn, Fabrizio is an influential figure in the game
development community. He regularly shares knowledge, tutorials, assets, and updates
of the latest trends in real-time graphics. His ability to simplify complex concepts and his
commitment to education have made him a reference for many developers.

Since 2022, Fabrizio has been working at Rovio, the creators of Angry Birds, as a Senior
Technical Artist. Previously, he collaborated with renowned companies like DeNA and Osmo, where
he gained extensive experience in creating and optimizing assets for various applications. His
passion for teaching has led him to write books, tutorials, and manuals, sharing his knowledge
with the independent developer community.

For more information about his work and projects, visit jettelly.com

Errata.

While writing this book, we have taken precautions to ensure the accuracy of its content.
Nevertheless, you must remember that we are human beings, and it is highly possible that
some points may not be well-explained or there may be errors in spelling or grammar.

If you come across a conceptual error, a code mistake, or any other issue, we appreciate
you sending a message to [email protected] with the subject line “VE2 Errata.” By doing
so, you will be helping other readers reduce their frustration and improving each subsequent
version of this book in future updates.


Furthermore, if you have any suggestions regarding sections that could be of interest to
future readers, please do not hesitate to send us an email. We would be delighted to include
that information in upcoming editions.

Piracy.

Before copying, reproducing, or distributing this material without our consent, it is important
to remember that Jettelly Inc. is an independent and self-funded studio. Any illegal practices
could negatively impact the integrity of our work.

This book is protected by copyright, and we will take the protection of our licenses very
seriously. If you come across this on a platform other than jettelly.com or discover an illegal
copy, we sincerely appreciate it if you contact us via email at [email protected] (and
attach the link if possible), so that we can seek a solution. We greatly appreciate your
cooperation and support. All rights reserved.


Chapter 1

Introduction to Mesh Composition.


In this chapter, you’ll take your first steps into shader development — an essential skill for
crafting everything you see on a computer screen. But what exactly is a shader? In Godot, a
shader is a small program with the .gdshader extension that runs on the GPU. It processes
every visible pixel in the scene, allowing you to draw and visually modify objects using
coordinates, colors, textures, and other properties derived from the object’s geometry.

Shaders take advantage of the GPU’s parallel architecture. A modern GPU contains thousands
of small cores that can execute operations simultaneously, making it ideal for performance-
intensive tasks such as rendering, visual effects, and dynamic lighting.

Before you start writing your own shaders, it’s important to understand the building blocks
that make them work. You’ll explore the internal and external properties you can control
through code. This foundational knowledge will help you understand how shaders operate,
how they interact with the scene, and how you can harness them to create unique and
compelling visual effects.

1.1 Properties.

Before you dive into shaders and visual effects, it’s important to understand what a mesh is
in Godot. According to the official Godot documentation:

“Mesh is a type of Resource that contains vertex arrays-based geometry,
divided in surfaces. Each surface contains a completely separate array and a
material used to draw it. Design wise, a mesh with multiple surfaces is
preferred to a single surface, because objects created in 3D editing software
commonly contain multiple materials.”
https://docs.godotengine.org/en/stable/classes/class_mesh.html

This definition is consistent with those found in other 3D tools, such as Autodesk Maya or
Blender, as they share fundamental principles in defining and manipulating 3D geometry. In
the context of shader and visual effects development, meshes serve as the foundation upon
which materials, textures, and other resources are applied — bringing life and personality to
the models displayed on screen.


Analyzing the previous description, we can identify two key components that deserve closer
attention: vertex arrays and materials, with the latter being essential for rendering a mesh.
In computer graphics, both numerical data — such as geometry, coordinates, and attributes
— and visual resources that enable objects to be displayed on screen must be considered.
Understanding this duality between data (geometry, coordinates, attributes) and visual
representation (materials, textures, shaders) is fundamental to your role as a technical artist.

When we talk about vertex arrays, we refer to the ordered collection of points that define an
object’s shape. A mesh does more than just store these points — it also integrates various
properties that, together, form the polygonal faces (triangles) that shape the model. These
properties include:

• Vertices.
• Normals.
• UV Coordinates.
• Tangents.
• Bitangents (or Binormals).
• Vertex Color.

To understand the importance of these properties, imagine holding a sheet of paper. If
someone asked you to draw on the “front” side, how would you determine which side that is
if both look identical? This is where normals come into play — they indicate the direction
a surface is facing, helping define which side will be visible when viewing the object. In
3D applications like Maya or Blender, the software typically renders only the faces whose
normals point toward the observer (camera), ignoring or not rendering those that are flipped.
This optimization technique improves performance by reducing the number of unnecessary
calculations during rendering.

Tangents and bitangents are also important, though they play a slightly lesser role than
normals in basic visual effects. Continuing with the sheet of paper analogy, if the sheet
has four vertices and four normals, you can also define four tangents and four bitangents
associated with them.

Together, normals, tangents, and bitangents form a local coordinate system for the model’s
surface, which is essential for applying normal maps and achieving advanced lighting
effects. This foundational concept will be crucial when designing, optimizing, and refining
your shaders and visual effects, as it helps you understand how geometry and graphical
resources interact in the creation of 3D environments and characters.

(Figure 1.1.a: Conceptualizing Mesh properties on a notebook page)

The tangent is typically aligned with the U-axis of the UV coordinates, while the bitangent
is oriented along the V-axis. Together, they define a local reference system at each vertex,
known as tangent space, which is essential for computing advanced effects such as normal
mapping, reflection simulation, and other complex visual effects. This space allows you to
interpret additional texture details relative to the surface of the mesh, giving you greater
control over the final visual outcome.

Later in this book, we will dedicate an entire section to creating visual effects using tangent
space, UV coordinates, and vertex color. For now, it’s important to highlight that for a mesh’s
properties to be visible in the engine, it must be linked to a rendering process and one or
more materials.

In Godot, the API responsible for processing a mesh’s geometry and displaying it in the scene
is the RenderingServer. Without this API, the mesh would exist solely as a data resource in
memory, with no visual representation or interaction with the camera or lights. Its represen-
tation in the engine takes shape through a node called VisualInstance3D, which serves as
the entry point for manipulating 3D objects. One of its most commonly used child nodes is
MeshInstance3D, which internally manages the visual interactions between the mesh and
its material. This allows you to work more intuitively, without having to deal directly with the
technical details of the rendering process.

From the MeshInstance3D node, you can access both the mesh and its material directly
from the Inspector. In Godot, a material is a resource that defines how an object’s surface is
rendered, determining its interaction with light, shadows, textures, and other visual effects.
At its core, a material contains a shader — a program that runs on the GPU. Godot uses a
simplified variant of GLSL (OpenGL Shading Language) to define these shaders, making them
easier to edit and understand.

Through the shader, you can access the mesh’s properties. For instance, the predefined
variable VERTEX allows you to manipulate vertex positions, enabling direct modifications to
the mesh within the shader. Similarly, variables such as NORMAL, TANGENT, and BITANGENT
provide control over normals, tangents, and bitangents, respectively. Understanding how
these variables work requires visualizing the numerical values stored in the Vertex Data. By
doing so, you can better grasp how different properties interact to produce the final visual
output on screen.

1.2 Mesh Data.

In this section, you’ll carry out a hands-on exercise to understand how vertex lists and their
properties are grouped and then read by the GPU through a shader to visually render
geometry on-screen. To do this, you’ll create a QuadMesh in .obj format using a simple text
editor (such as Notepad).

First, create a new Godot project and name it Jettelly Godot Course. Make sure to select
Forward+ as the Renderer. Later, we will explore the architecture of a Rendering Pipeline and
the differences between various rendering techniques. For now, assume that you’ll be using
this project throughout the entire learning process on a desktop computer. This approach
ensures a consistent reference setup as you progress.

(Figure 1.2.a: Creating a new project in Godot 4.3)

Once your project is open, start by organizing your resources. Navigate to the FileSystem
panel and create your first folder, naming it assets. From this point on, all generated files will
be stored in subfolders within assets, providing a cleaner workspace structure and making
version control easier. For instance, if you are using Git, you can link this folder and commit
changes exclusively to its contents, preventing accidental modifications to the project’s main
configuration.

Inside assets, create a new subfolder called chapter_01, and then, within it, create another
folder named custom_mesh. This is where you’ll store the files needed for the upcoming
exercises.

(Figure 1.2.b: Initial project structure)

As mentioned, you’ll create the QuadMesh for this exercise using a text editor. The process is
straightforward: navigate to the custom_mesh folder in your operating system and create
a new text file. On Windows, you can do this by right-clicking the destination folder and
selecting New > Text Document, as shown below:

(Figure 1.2.c: Creating a text document in Windows)

Name this document custom_quad, and change its extension from .txt to .obj. This .obj
format is a standard for representing 3D geometry. However, after making this change and
returning to Godot, you might notice a warning icon (an “X”) next to the file. This indicates
that the file is “corrupt” or incomplete — which is expected, as we haven’t yet added the
necessary data to define the QuadMesh’s geometry. In the next steps, you’ll define all the
required information so that Godot can correctly interpret and render the object.


When exporting 3D models in .obj format, an additional file with the .mtl (Material Template
Library) extension is also generated. This file contains information about the object’s lighting
properties. For now, we won’t focus on this file, as our main priority is the geometry itself. We’ll
let Godot assign its default lighting values. Later in this book, you’ll learn how to work with
materials and explore how this library affects the final appearance of your geometry.

A basic .obj file structure typically includes:

• Definitions of vertex positions in local space.
• Definitions of UV coordinates in UV space.
• Definitions of normals in local space.
• Smoothing angle parameters.
• Model face groups.
• Model face definitions.

To properly visualize the QuadMesh in Godot, you’ll need to define this information. This
ensures that the GPU can project the geometry onto the screen. Before you begin, create a
3D scene by choosing a Node3D as the parent node. Then, add a MeshInstance3D node as
its child. This node will allow you to assign the custom mesh you just created and render it
within the scene.

(Figure 1.2.d: Selecting a 3D scene)

According to the official documentation:

“MeshInstance3D is a node that takes a Mesh resource and adds it to the
current scene by creating an instance of it. This is the class most often used to
render 3D geometry and can be used to instance a single Mesh in many
places.”
https://docs.godotengine.org/en/stable/classes/class_meshinstance3d.html

If you select the MeshInstance3D in the Scene window and navigate to the Inspector, you’ll
notice that its first property is Mesh, which allows you to assign a mesh resource directly.
Attempting to do this now with your custom_quad.obj will not work, because you have not
yet defined the object’s data, and it remains marked as corrupt. To fix this, open the .obj file
in your preferred text editor and add the necessary properties.

Since Godot uses meters by default, you’ll use a unit system where a range of -0.5 to 0.5
represents a 1x1 meter QuadMesh. This approach makes it easier for you to visualize and
adjust the geometry as needed.

(Figure 1.2.e: Defining the vertices for a QuadMesh)

In Figure 1.2.e, four vertices have been defined, one at each corner of the QuadMesh,
following a clockwise order for convenience. However, when defining the faces later, we will
use the reverse order to align with Godot’s conventions, where the +𝑧 axis faces the camera.
In .obj files, vertices are defined using the letter v.
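
For reference, the vertex block of the file could look like the sketch below. The values follow
the description above (a 1 x 1 meter quad spanning -0.5 to 0.5 on X and Y); placing the quad
on the Z = 0 plane is an assumption made for this exercise, and Figure 1.2.e remains the
authoritative version.

# custom_quad.obj - vertex positions (v) in local space, listed clockwise
v -0.5 -0.5 0.0
v -0.5  0.5 0.0
v  0.5  0.5 0.0
v  0.5 -0.5 0.0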

(Figure 1.2.f: The QuadMesh’s vertices defined in a clockwise direction)

The VERTEX variable in the shader will give you access to these values at runtime, allowing
you to eventually modify their positions or manipulate the geometry in more complex ways.
While this example only involves four vertices, a realistic model could contain thousands or
even millions of them, organized into different surfaces and vertex arrays. For instance, imagine
you have a 3D warrior divided into two separate pieces — body and sword — the model
would likely have one vertex array for the body and another for the sword.

Next, let’s define the UV coordinates, represented as vt in .obj files. These coordinates, ranging
from 0.0 to 1.0, allow you to map textures onto a surface. In Section 1.3, we will explore this type
of space in greater detail. For now, assign a UV coordinate to each vertex of the QuadMesh
using the following values:

• Vertex #1 (bottom-left): [0.0, 0.0].
• Vertex #2 (top-left): [0.0, 1.0].
• Vertex #3 (top-right): [1.0, 1.0].
• Vertex #4 (bottom-right): [1.0, 0.0].

(Figure 1.2.g: Defining UV coordinates on a QuadMesh)

These coordinates are essential for projecting textures onto the object’s surface. In the shader,
the built-in variable UV lets you access and manipulate these values, making it particularly
useful for creating visual effects, shifting textures across the surface, or even implementing
more complex mapping techniques.

(Figure 1.2.h: Visualizing the UV coordinates)

The next step is to define the normals, represented in .obj files by vn. Normals are vectors
that indicate the orientation of the surface and are essential for calculating lighting. In this
case, we will define the normals so that they face toward the viewer. This means their XY
components will be 0.0, while the Z component will be 1.0, pointing forward.

(Figure 1.2.i: Defining the normals on a QuadMesh)

It is worth noting that the Z component could just as easily be -1.0. However, a positive value
has been chosen to align with Godot’s spatial axes and its default directional lighting.

(Figure 1.2.j: Visualizing normalized normals on a QuadMesh)

The NORMAL variable in the shader will give you access to this information, and by manipulating
it, you can change how light interacts with the surface. This opens the door to a wide range
of effects — from subtle changes in the appearance of lighting to creating complex visual
distortions.

Up to this point, we’ve defined vertices, UV coordinates, and normals — the basic elements
the GPU needs to understand your geometry’s shape and orientation. However, we still need
to specify how these vertices are grouped to form polygonal faces. In an .obj file, faces are
defined with the letter f, followed by the vertex index, UV coordinate index, and normal index,
in that order, separated by slashes (/).

(Figure 1.2.k: Defining the Mesh faces (f))

For example, a reference like [4/4/4] indicates that the face uses vertex number 4, UV
coordinate number 4, and normal number 4. The s parameter sets normal smoothing; when
it is off, no smoothing is applied.
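
Putting the previous steps together, the complete file could look roughly like the sketch below.
It is assembled by hand from the values discussed in this section (the single quad face and its
exact winding are assumptions), so treat Figures 1.2.e through 1.2.k as the reference.

# custom_quad.obj - a hand-written sketch of the full file

# Vertex positions (v) in local space, listed clockwise
v -0.5 -0.5 0.0
v -0.5  0.5 0.0
v  0.5  0.5 0.0
v  0.5 -0.5 0.0

# UV coordinates (vt), one per vertex
vt 0.0 0.0
vt 0.0 1.0
vt 1.0 1.0
vt 1.0 0.0

# Normals (vn), all pointing toward +Z
vn 0.0 0.0 1.0
vn 0.0 0.0 1.0
vn 0.0 0.0 1.0
vn 0.0 0.0 1.0

# No normal smoothing
s off

# One quad face (vertex/uv/normal), wound counter-clockwise so the
# front of the quad faces +Z
f 1/1/1 4/4/4 3/3/3 2/2/2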


Once you define the faces, save the changes and return to Godot. By right-clicking on
custom_quad.obj in the FileSystem and selecting Reimport, the object will update. You can
now assign the resulting Mesh to the MeshInstance3D node without any issues.

(Figure 1.2.l: The custom_quad.obj has been assigned as a mesh in the node)

With this, you have completed the basic definition of a QuadMesh using an .obj file, setting
up vertices, UV coordinates, normals, and the faces that form the geometry. In later chapters,
you’ll explore how to manipulate this data further, how it integrates with materials and
different coordinate spaces, and how you can leverage it to create visually appealing effects
in your 2D and 3D scenes.

1.3 Introduction to Spaces.

Which came first: time or space?

When working with shaders, one of the most challenging concepts for graphics developers is
managing spaces. These define how fundamental rendering elements — such as position,
direction, and movement — are represented and transformed within a virtual world. Each
space serves a specific purpose, and understanding their differences is crucial for developing
advanced visual effects and complex mechanics.


Some of the most common spaces in computer graphics include:

• UV space (or texture space).
• Object space (or local space).
• World space.
• Tangent space.
• View space.
• Clip space.
• Screen space.

Before defining each of these spaces, let’s go through a practical exercise to help you
understand their nature using a real-world analogy. Imagine your position in the world. Right
now, you are on planet Earth, meaning your location is relative to it. If you remain completely
still, your position relative to Earth does not change. Even though the planet moves through
the Milky Way, your position remains the same in relation to Earth.

However, if we wanted to calculate our position relative to the galaxy, the situation would
change. In this case, considering Earth’s movement through space, our position within the
Milky Way would vary over time. This introduces changes in position values, where the XYZ
components would be constantly updated, reflecting our movement on a larger scale.

This concept of relative reference is key to understanding how different spaces work in
computer graphics. Depending on the reference system used, the same position can be
expressed differently, directly affecting how we interpret and manipulate data within a
shader.

This same principle applies to the different spaces mentioned earlier. For example, object
space describes the position of a mesh’s vertices relative to the object itself, meaning its pivot
point or center of mass. If you move the object within the world, its vertices shift accordingly.
However, their relative position to the object’s center remains unchanged — just like when
we previously defined vertices in a .obj file, where each position was expressed in relation to
the object itself rather than its surroundings.

Existential question: The atoms in our body move with us, which might lead us to think they
exist in object space. But relative to what? Is there an absolute center in our body? How do
atoms ”know” where they should be at all times? These questions reflect the relative nature
of spaces in computer graphics: the position of a point only has meaning within a specific
frame of reference.


Continuing with our analogies, world space describes positions relative to the world itself. In
software like Godot, this space is defined within the viewport, which represents the game
world. It doesn’t matter whether you’re working in a 2D or 3D environment — the grid visible
in the editor view corresponds to this global space.

On the other hand, UV space refers to the coordinates used to map textures onto a surface.
However, before fully understanding this concept, we first need to define what a texture is.
From a graphical perspective, a texture is a collection of pixels arranged in a two-dimensional
structure, forming an image that can be applied to an object.

From a mathematical perspective, a texture is defined as:

𝑇(𝑢, 𝑣) → 𝑅ⁿ

(1.3.a)

Where 𝑇 represents the texture function, 𝑢𝑣 are its coordinates, and 𝑅ⁿ defines the output
space, meaning the number of color components. For example, if 𝑛 = 3, the texture could
be a .jpg image, which contains three color channels (RGB). In contrast, if 𝑛 = 4, we could be
referring to a .png image, which includes a fourth channel (alpha) for transparency (RGBA).

From a programmatic perspective, a texture is simply a data structure organized as a list of
values. Each element in this list represents a pixel, containing both color information and a
specific position within the image.


To better visualize this, let’s consider a 4×4 pixel texture:

(Figure 1.3.b: Texture data visualization)

If we examine the reference above, we can see that on the left, there is a dictionary-style
representation of a list called image_data_4x4. In this structure, each entry stores both the
pixel’s position and its color value in RGB format. On the right, we find the visual representation
of this data: a texture composed of 16 texels (or pixels in Photoshop), each assigned a
specific color.

It’s important to note that, while we can conceptually represent a texture as a dictionary,
this approach is neither memory-efficient nor performance-friendly. However, it serves
as a useful model for understanding the nature of a texture and its relationship with UV
coordinates. In practical applications, textures in computer graphics are typically stored as
one-dimensional arrays of numerical values, optimized for GPU processing. Each texel within
the texture is accessed using its UV coordinates, enabling operations such as sampling,
interpolation, and filtering.


UV coordinates are defined in a normalized space, where the origin is located at (0.0𝑢, 0.0𝑣)
and the opposite corner at (1.0𝑢, 1.0𝑣). This coordinate system allows mapping the
information stored in a texture, similar to the representation shown in Figure 1.3.b.

(Figure 1.3.c: Vertices and UV coordinates)

Since UV space is relative to each vertex of the object, moving the model in world space
does not affect its UV coordinates. This ensures that the texture remains correctly projected
onto the surface. Additionally, UV coordinates are defined using floating-point values,
providing high precision in texture mapping. This characteristic is crucial for achieving high
texel density, allowing you to add sharp details to characters and objects without losing
resolution.
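
A quick way to visualize both ideas is to paint the surface with its own UV coordinates. This
short fragment, written with the syntax covered in Section 1.6, produces a gradient that stays
glued to the surface no matter where the object is placed in world space.

void fragment()
{
    // U drives the red channel and V drives the green channel, producing
    // the familiar black/red/green/yellow UV gradient.
    ALBEDO = vec3(UV, 0.0);
}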

1.4 Configuring a Unique Object.

During this chapter, you will complete a hands-on exercise in Godot to better understand how
world space functions and its relationship with the internal variables NODE_POSITION_WORLD
and MODEL_MATRIX.

To streamline the implementation process, follow these steps to set up your project:

1 Inside your project folder, navigate to chapter_01 and create a new folder named
spaces.
2 Inside spaces, add another folder named world_space.


3 Finally, within world_space, organize the content by creating the following subfolders:
a materials.

b shaders.

c meshes.

d textures.

If you have followed these steps correctly, your project structure should look like this:

(Figure 1.4.a: The shader folder has been included inside materials)

To illustrate the usefulness of global coordinates, we will complete the following exercise:
Using a shader, we will modify the scale and color of a 3D object’s texture based on its position
in world space. This will allow us to observe how the same object dynamically changes as
it moves through the scene. As a result, if you duplicate the object and move it across the
environment, each instance will display unique variations in its appearance without manually
modifying its material.

Assuming you have already downloaded the supporting resources for this book, we will use
the tree_mesh.fbx model during this exercise. Import this model into the meshes folder
that we created earlier. Additionally, make sure to include the checker_tex and tree_tex

The Godot Shaders Bible. 26


Introduction to Mesh Composition.

textures inside the textures folder. These assets will be essential for visualizing the effects
applied through the shader.

Note

Remember that this book includes a downloadable package containing all the necessary files
to follow the exercises step by step. You can download it directly from: https://jettelly.com/
store/the-godot-shaders-bible

It’s important to mention that, in addition to the model and textures, we will also need a
material and a shader. In the shader, we will implement functions in both the vertex stage
vertex() and the fragment stage fragment() to modify both the color and the scale of the
object. To create these resources, right-click on the materials folder and select:

• Create New > Resource > ShaderMaterial.

For practical reasons, name this material world_space.

Repeating part of the previous process, navigate to the shaders folder, right-click, and select:

• Create New > Resource > Shader.

For the shader, use the same name as the material (world_space). This will make it easier
to identify the connection between the two. If everything has been set up correctly, your
project structure should look like this:

(Figure 1.4.b: The downloadable files have been included in the project)

Next, we will begin the implementation process. To do this, we need a single instance of the
tree_mesh object.

Follow these steps to create it:

1 Locate the file tree_mesh.fbx in the FileSystem panel.


2 Right-click on it and select New Inherited Scene.

This action allows you to create a scene based on the original file, maintaining an inheritance
relationship. This makes it easier to apply future modifications without altering the base
model.

By doing this, a new scene will open in the Scene window, containing the following elements:

• A Node3D node named tree_mesh.
• A MeshInstance3D node named geo_tree_013.


To demonstrate the use of global coordinates, we need to create multiple instances of
tree_mesh at different positions within the 3D scene.

However, when viewing the object in the Scene window, you may notice it appears highlighted
in yellow. This indicates that the node is locked due to inheritance, preventing direct editing.

In fact, if you attempt to rename the node by pressing F2 (Rename), you will see the following
message on the screen:

(Figure 1.4.c: The node is not editable)

To use and customize the node according to your needs, follow these steps:

1 Select the node in the Scene window.


2 In the Inspector panel, locate the mesh property.
3 Click Make Unique to unlink the inherited resource and convert it into an independent
instance.
4 Save the new resource in the meshes folder of your project.

5 To make it easier to identify, rename it tree.


By doing this, you will have an independent object that you can modify freely without affecting
the original resource.

(Figure 1.4.d: The object has been marked as unique)

If you have followed all the steps correctly, you should now find a new independent file inside
the meshes folder. This file can be used flexibly throughout the exercise without affecting the
original resource.

(Figure 1.4.e: A unique object has been saved)

In this step, you need to create a new scene that contains multiple instances of the tree
object. Therefore, navigate to the Scene window and start a new scene by selecting:

• Menu: Scene > New Scene.

Alternatively, you can use the keyboard shortcut Ctrl + N. Since we are working with a 3D
object, select 3D Scene as the scene type.

Note

According to the Godot 4.3 documentation, Node3D is the base node for 3D objects, providing
essential transformation and visibility properties. All other 3D nodes inherit from it. In a 3D
project, you should use Node3D as the parent node whenever you need to move, scale, rotate,
show, or hide its child elements in an organized and efficient manner. For more information,
visit the following link: https://docs.godotengine.org/en/stable/classes/class_node3d.html

Next, add a MeshInstance3D as a child node by following these steps:

1 Select the previously created Node3D in the Scene window.


2 Right-click on it and choose Add Child Node.
3 In the node list, search for MeshInstance3D and add it to the scene.

While we could use a MultiMeshInstance3D node along with a MultiMesh resource to
optimize the creation of multiple instances, it is not necessary in this case. Since we will only
use a few copies of the tree object to demonstrate the utility of global coordinates, a
MeshInstance3D node is sufficient and appropriate. This is because it allows you to directly
assign a Mesh resource.

Now, simply select the tree object and assign it to the Mesh property of the MeshInstance3D
node, as shown below:

(Figure 1.4.f: The tree object has been assigned to the Mesh property)

With this last step, you have completed the initial setup needed to begin developing your
shader. Now, follow these steps:

1 Assign the world_space shader to the material.


2 Apply the material to the tree object.

If everything has been done correctly, the tree object should appear white in the scene. This
indicates that the shader has been successfully applied and is now ready for programming.

1.5 Rendering Stages and Pipeline.

To begin this section, locate the world_space shader and open it by double-clicking on it. In
Godot, shaders are structured into processing stages, and two of the most commonly used
methods are:


• vertex(): Responsible for manipulating vertices before rasterization.
• fragment(): Handles processing each fragment (pixel) in the final stage of rendering.

These methods control a series of GPU processes, including:

• Vertex transformation.
• Attribute processing.
• Projection.
• Clipping.
• Screen mapping.
• Color processing.
• Transparency calculations.

In fact, when you open a shader in Godot, its initial configuration typically follows this structure:

shader_type spatial;

void vertex() {
    // Called for every vertex the material is visible on.
}

void fragment() {
    // Called for every pixel the material is visible on.
}

//void light() {
//    // Called for every pixel for every light affecting the material.
//    // Uncomment to replace the default light processing function ...
//}

In the previous code, you may notice the presence of a third method called light(), which
is commented out by default. This method executes once per light source that affects the
object. However, in this section, we will focus only on the vertex() and fragment() methods,
leaving the analysis of light() for a later discussion.


To understand how the vertex() and fragment() methods work, it’s essential to grasp the
nature of the Render Pipeline. An analogy I like to use to explain this concept is the pipes in
Super Mario.

In Section 1.2 of this book, we created a QuadMesh using coordinates defined in a .obj file.
As you may have noticed, all shapes — whether 2D, 3D, or even colors — initially exist only
as data in memory. This leads us to a key question: How are these data transformed into
visible, colorful objects on the screen? The answer lies in the Render Pipeline.

We can think of this concept as a chain process, where a polygonal object travels through
different stages until it is finally rendered on screen. It’s as if the object moves through a
series of pipes, gradually transforming step by step until it reaches its final destination: your
computer screen.

(Figure 1.5.a: Data being transformed into a three-dimensional Cube)

Godot uses a unified Render Pipeline, unlike Unity, where each Render Pipeline (URP, HDRP, or
Built-in RP) has its own characteristics and rendering modes. In Unity, the choice of Render
Pipeline directly affects material properties, light sources, color processing, and overall
graphic operations that determine the appearance and optimization of objects on screen.


In Godot 4, the rendering system is built on top of modern graphics APIs (Vulkan for the
Forward renderers, with an OpenGL-based Compatibility fallback) and offers different
rendering modes. These modes were introduced at the beginning of the book when we set
up our project. The available rendering modes are:

• Forward+.
• Forward Mobile.
• Compatibility.

Starting with Forward+, if we refer to the official Godot documentation, we find the following
information:

“This is a Forward renderer that uses a clustered approach to lighting.
Clustered lighting uses a compute shader to group lights into a 3D frustum
aligned grid. Then, at render time, pixels can look up what lights affect the
grid cell they are in and only run light calculations for lights that might affect
that pixel. This approach can greatly speed up rendering performance on
desktop hardware, but is substantially less efficient on mobile.”
https://docs.godotengine.org/en/stable/tutorials/rendering/renderers.html

The term clustered approach refers to an optimization method used in lighting calculations.
Instead of computing lighting individually for each pixel or object in the scene, this method
divides the 3D space into clusters (small spatial regions). During rendering, each pixel
only queries the lights affecting its specific cluster, avoiding unnecessary calculations and
improving performance — especially in scenes with complex dynamic lighting.

Thanks to this optimization, Forward+ is primarily used in PC projects, where high graphical
quality and multiple light sources are required. In contrast, other rendering modes in Godot
are designed for different needs:

• Forward Mobile: Optimized for mobile devices, prioritizing performance over graphical
  quality.
• Compatibility: Also based on Forward Rendering, but intended for older GPUs that do
  not support Vulkan.


Note

Given the specifications and limitations of OpenGL, we will not cover all aspects of Godot’s
rendering system in detail. If you would like to learn more about the different rendering modes
and their features, you can visit the following link: https://docs.godotengine.org/en/stable/
contributing/development/core_and_modules/internal_rendering_architecture.html

Now, what is the relationship between these rendering modes and the vertex() /
fragment() methods? To understand this better, let’s examine the following reference:

(Figure 1.5.b: Some of the most important processes of the Render Pipeline)

The Input Assembler is the component responsible for receiving and organizing the data of
objects (meshes) present in the scene. This process includes information such as:

• Vertices.
• Indices.
• Normals.
• Tangents.
• UV Coordinates.
• Other geometric attributes.


All this information is stored as numerical values and organized in a structured manner so
that the Render Pipeline can correctly interpret it. This structured data is then processed to
ultimately generate the 3D objects displayed on screen.

The choice of rendering mode determines which rendering technique will be used in the
project (Forward+, Forward Mobile, or Compatibility). This selection has a direct impact on
both performance and visual quality, as it defines how graphics are processed and which ren-
dering API will be used (Vulkan, Direct3D, OpenGL ES, etc.). Depending on the selected mode,
certain graphical effects and optimizations will be either available or restricted, influencing
how shaders interact with the scene.

The vertex processing stage, represented by the vertex() method, is one of the fundamental
phases in a shader. During this stage, the data retrieved from the Input Assembler is
transformed through different coordinate spaces, following this sequence:

1 Object Space: The local space of the model, where vertices are defined relative to their
  own pivot.
2 World Space: Transforms vertices based on the object’s position within the scene.
3 View Space: Adjusts vertices according to the camera’s perspective, determining how
  they appear to the observer.
4 Clip Space: Projects vertices into a homogeneous coordinate system, preparing them
  for the next phase.
5 Normalized Device Coordinates: Performs division by W, normalizing coordinates to a
  range between -1.0 and 1.0 on each axis.
6 Screen Space: Assigns vertices to pixel coordinates on the screen, where the object is
  rasterized and rendered.

This process ensures that the 3D scene is correctly projected onto a 2D surface (screen),
converting polygonal models into visible images on screen.
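
The first four steps of this sequence can also be written out by hand in the vertex stage, using
the built-in matrices that Section 1.11 covers in detail. The sketch below only mirrors what the
engine already does for you when you do not override the output position.

void vertex()
{
    vec4 local_pos = vec4(VERTEX, 1.0);             // 1. object space
    vec4 world_pos = MODEL_MATRIX * local_pos;      // 2. world space
    vec4 view_pos  = VIEW_MATRIX * world_pos;       // 3. view space
    vec4 clip_pos  = PROJECTION_MATRIX * view_pos;  // 4. clip space
    // Steps 5 and 6 (division by W and the mapping to screen pixels) are
    // performed later by fixed-function hardware stages.
}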

Once the vertices are transformed into screen space, the next step in the Render Pipeline is
rasterization. During this stage, the geometric primitives — such as triangles, lines, or points
— are converted into fragments (potential pixels) on the screen. This process determines
which parts of the object will be visible and how they will be distributed in the final image.


Additionally, during this stage, Depth Testing is applied — a key mechanism that determines
the visibility of fragments based on their relative distance from the camera. This test
compares the depth of each fragment with the values stored in the Z-buffer (or Depth Buffer),
discarding those that are occluded by other objects. This ensures that only the closest visible
surfaces are rendered, preventing overlapping artifacts and improving scene realism.

After rasterization, each fragment enters the fragment stage fragment(), where its final color
is computed before being displayed as a pixel on the screen. In this phase, key visual effects
are determined, including lighting, textures, transparency, and reflections. Additionally, each
fragment is processed independently in the GPU, enabling the efficient execution of complex
calculations while maintaining high performance.

Note

It’s important to highlight that the light() method does not replace the fragment() method;
rather, it is executed afterward, but only in the presence of light sources that affect the current
fragment. We’ll explore its behavior in more detail throughout Chapter 2.

Finally, the Blending and Post-Processing stages are applied, incorporating visual effects
before rendering the image on screen. At this stage, elements such as transparency, color
blending, bloom, anti-aliasing, and color correction are processed. These effects enhance
visual quality and optimize the final appearance of the scene, ensuring a more polished and
immersive rendering.

Now that we understand the relationship between the different rendering modes and shader
methods, we will proceed with the implementation of world_space.

1.6 Working with World Space Coordinates.

The directive shader_type spatial is a key attribute of our shader as it specifies that the
shader will be applied to 3D objects.


Note

If we wanted to create shaders for UI elements or sprites, we would need to use the shader_type
canvas_item directive. For particle effects, on the other hand, we would use shader_type
particles. You can find more information about this at the following link: https://docs.godotengine.
org/en/stable/tutorials/shaders/shader_reference/index.html

Since our tree object currently appears white by default, we will start by adding a texture to
our shader. To do this, add the following line of code:

shader_type spatial;

uniform sampler2D _MainTex : source_color;

void vertex()
{
    // Called for every vertex the material is visible on.
}

In the previous code, a new sampler2D variable named _MainTex was declared, which
includes the hint source_color. This variable is defined globally, meaning it can be used in
any method within the shader.

A sampler2D refers to a two-dimensional texture, which can be used within the shader. The
keyword uniform indicates that the texture will be assigned externally from the material. On
the other hand, source_color specifies that the texture functions as an albedo map (base
color), meaning it determines the main color of the material.

Note

source_color is just one of the many hints you can use in a shader. For example, you can use
hint_normal to define a normal map. If you want to see the complete list of available hints,
you can check the following link: https://docs.godotengine.org/en/latest/tutorials/shaders/
shader_reference/shading_language.html
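
For context, a few other hints look like this. The extra uniform names below (_NormalTex,
_Tint, _Intensity) are hypothetical and are not used anywhere in this exercise.

uniform sampler2D _MainTex : source_color;              // albedo texture (used here)
uniform sampler2D _NormalTex : hint_normal;             // a normal map
uniform vec4 _Tint : source_color = vec4(1.0);          // shows a color picker
uniform float _Intensity : hint_range(0.0, 2.0) = 1.0;  // shows a slider in the Inspector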


In fact, if you save the changes and select the material from the FileSystem window, you
will see a new property called ”Main Tex” in the Shader Parameters section. This property
directly references the _MainTex variable we just declared in the shader, allowing you to
assign a texture from within the Godot editor.

(Figure 1.6.a: A new texture property has been declared)

However, if you assign any texture to the property, you’ll notice that the tree object in the
scene does not change color. This happens because we haven’t yet declared an RGB vector
to process the texture color for each fragment.

Since we need to compute the texture color for each pixel, we must go to the fragment stage
of our shader and add the following lines of code:

void fragment()
{
    vec3 albedo = texture(_MainTex, UV).rgb;
    ALBEDO = albedo;
}

From the previous code, there are some key aspects we need to consider for interpreting the
operation we are performing:

• The texture() function: This function returns an RGBA vector and takes two arguments:
  a sampler2D variable (the texture) and a 2D vector (the UV coordinates). Essentially,
  texture() samples the texture _MainTex using the UV coordinates, retrieving the
  corresponding color, which is then stored in the RGB vector albedo.
• Assignment to ALBEDO: The retrieved texture color is assigned to the ALBEDO variable, a
  predefined Godot variable representing the base color of the fragment’s surface.
• Texture Projection: Since UV coordinates are calculated based on vertex positions in
  object space, the texture should be correctly projected onto the object within the scene.

(Figure 1.6.b: The tree object has color now)

Note

Remember that texels are the color points within a texture—the smallest color units that make
up the image. In contrast, pixels are the color points displayed on your computer screen. Now,
why don’t we see clearly defined points in the texture applied to the tree object? This happens
because Godot automatically applies linear interpolation, smoothing the transition between
texels instead of displaying sharp edges.

Let’s do a simple exercise now: we’ll modify the texture color based on the tree’s global
position. To achieve this, we’ll use the internal variable NODE_POSITION_WORLD, which refers
to the object’s Transform Position in the scene. Since this variable is directly linked to the
object’s position, we can assume that its return value is a three-dimensional vector, covering
the usable area within the scene.

This approach presents a potential issue: What happens if the global position of the tree
object is (100𝑥, 234𝑦, 134𝑧) or any other value greater than 1.0? In a shader, colors are
defined within a range of 0.0 to 1.0. If position values exceed this range, the result could be
an entirely white object, as color values beyond 1.0 are interpreted as maximum intensity.

To prevent this issue, we will create a function that limits the return values, ensuring the global
position of the object is normalized within the appropriate range. However, we must make
sure to declare this function before the vertex() method. Why? Because the GPU reads
shaders sequentially, from top to bottom. If we need to use this function in both the vertex
stage and the fragment stage, it must be declared at the beginning of the shader code so
that both methods can access it.

uniform sampler2D _MainTex : source_color;

vec3 hash33(vec3 p)
{
    p = fract(p * 0.4567);
    p += dot(p, p.yzx + 1.3456);
    p.x *= p.x;
    p.y *= p.y;
    p.z *= p.z;
    return fract(p);
}

void vertex()

What happens in the hash33() method? Let’s analyze its structure from a mathematical
perspective. Given two three-dimensional vectors — one as input and one as output — suppose
𝑝 = (4.23, 5.52, 3.74):

For the first line of code,

𝑓𝑟𝑎𝑐𝑡(𝑝 ∗ 0.4567) = (0.931841, 0.520984, 0.708058)

(1.6.c)

If we multiply the first component of the vector 𝑝 by 0.4567, we get 𝑝𝑥 ∗ 0.4567 = 1.931841.
However, the fract() function is applied in the operation, which returns only the fractional
part of a number, discarding the integer part. This means 𝑓𝑟𝑎𝑐𝑡(1.931841) = 0.931841.
This applies to all components of the vector 𝑝.


Its definition for a floating-point value is:

float fract(float v)
{
    return v - floor(v);
}

Then,

𝑝 ⋅ (𝑝𝑦𝑧𝑥 + 1.3456) ≈ 4.42184278

(1.6.d)

This value is added to each component of the vector 𝑝. Therefore,

𝑝 = (5.353684, 4.942827, 5.129901)

(1.6.e)

Squaring each component,

𝑝𝑥 ∗ 𝑝𝑥 = 28.661930
𝑝𝑦 ∗ 𝑝𝑦 = 24.431537
𝑝𝑧 ∗ 𝑝𝑧 = 26.315882

(1.6.f)

Finally,

𝑓𝑟𝑎𝑐𝑡(𝑝) = (0.661930, 0.431537, 0.315882)

(1.6.g)


It is worth noting that the previous method could have any name we choose. However, it has
been named hash33 for the following reasons:

• It returns a three-dimensional vector as output (3).
• It generates pseudo-random values (hash).
• It receives a three-dimensional vector as an argument (3).

This type of method is commonly used to generate visual “noise.” However, when
combined with other functions, it can produce more interesting effects, such as the one we will
implement next.

void fragment()
{
    vec3 random_color_ws = hash33(NODE_POSITION_WORLD);
    vec3 albedo = texture(_MainTex, UV).rgb;
    ALBEDO = random_color_ws;
}

As we can see, a new three-dimensional vector named random_color_ws has been declared
and initialized with the output of the hash33() method, using the global position of the tree
object as input. Subsequently, this RGB vector is assigned to the ALBEDO variable, setting the
base color of the shader based on the object’s position in world space.

If you save the changes and move the object across the scene, you will notice that its color
changes dynamically depending on its position.

(Figure 1.6.h: The object’s color changes every time it changes its position in the world)

Due to the resolution of the object’s position, the color may flicker multiple times as you move
it. This happens because position values change continuously, causing multiple variations in
color.

In some cases, this effect may be a problem. To fix this, we can limit the color change so that
it only updates per unit of movement. Implementing this solution is simple: we extend the
color calculation operation and include the floor() function, which, by definition:

“It returns largest integer not greater than a scalar or each vector component.”
https://registry.khronos.org/OpenGL-Refpages/gl4/html/floor.xhtml

Its definition for a three-dimensional vector is:


•••
vec3 floor(vec3 v)
{
vec3 rv;
int i;

for (i = 0; i < 3; i++) {
rv[i] = v[i] - fract(v[i]);
}
return rv;
}

The floor() function rounds a value down to the nearest integer, discarding its fractional
part. In practice, this means the object’s position will be rounded down, ensuring that its
color only changes when it crosses a full metric unit in the scene.

For example, 𝑝 = (1.165, 0.0, 1.712)

𝑓𝑙𝑜𝑜𝑟(𝑝) = (1, 0, 1)

(1.6.i)

•••
void fragment()
{
vec3 random_color_ws = hash33(floor(NODE_POSITION_WORLD));
vec3 albedo = texture(_MainTex, UV).rgb;
albedo += random_color_ws * 0.15;
ALBEDO = albedo;
}

If you save the changes and move the tree object within the scene again, you’ll notice that
its color only changes when it crosses from one metric unit to another on the grid. This
prevents constant flickering and allows for better control over color variations based on the
object’s global position.


1.7 Introduction to Tangent Space.

Have you ever wondered how those eye-catching depth effects are created in the eyes
of a 3D character? One common approach is to use multiple mesh layers — one for the
cornea, another for the pupil, and so on — stacking them with transparency to simulate
depth. While this method works, it’s not always the most efficient. Instead, you can achieve a
similar result by leveraging the interaction between the view direction (from the camera)
and the vertex coordinates of the object. This technique simplifies the setup and can be
more performance-friendly.

To better understand how this works, take a look at the following image:

•••

(1.7.a The right eye displays depth)

At first glance, you might not notice it clearly; however, the eye on the right conveys a subtle
sense of depth that responds to the direction from which you view the scene. As you move the
camera, you’ll notice that the character’s pupil exhibits a certain level of three-dimensionality,
even though it’s rendered using only a two-dimensional texture.

To achieve this kind of effect, you need to rely on a specific coordinate system known as
tangent space, which is defined by three orthogonal vectors: the tangent, bitangent, and
normal.

As shown earlier in Figure 1.1.a (at the beginning of this chapter), tangent space can be
visualized through these three vectors that define the orientation of a vertex, almost as if

each vertex had its own transformation gizmo. However, tangent space isn’t limited to the
vertex stage — you can also define it per fragment, allowing you to perform transformations
at the pixel level.

Before diving into the implementation, it’s important to quickly analyze the topology of the 3D
object you’ll be working with. You’ll focus on some of its most important properties, particularly
its UV coordinates. For this purpose, you’ll use the anime_eye.fbx model included in the
downloadable files for this book, located in the following directory:

Chevron-Circle-Right Assets > chapter_01 > spaces > tangent_space > meshes

If you examine the object closely, you’ll see that it’s made up of three distinct layers:

Chevron-Circle-Right Eye.
Chevron-Circle-Right Skin.
Chevron-Circle-Right Pupil.

You’ll apply the shader to the Pupil layer. Therefore, your first step will be to review its UV
coordinates. Then, by making a comparison, you’ll explore how tangent space can help you
generate visual effects like displacement based on the view direction.

•••

(1.7.b Pupil vertices within the UV space)


Although the pupil has a geometric shape similar to an oval, its UV coordinates, in terms
of mapping, are contained within a square space. This detail is important because UV
coordinates allow movement along only two axes: U and V. In contrast, tangent space
introduces a third dimension, enabling you to simulate depth effects directly on the texture.

By applying displacement in tangent space, you can modify the appearance of the texels
around the pupil area. As a result, the perceived visual field expands, creating the illusion of
three-dimensionality without needing to alter the model’s geometry.

•••

(1.7.c The image on the right shows a slight UV displacement)

In Godot, textures are set to Repeat mode by default. This means that if you apply a significant
offset to simulate depth, the edges of the texture may begin to repeat across the 3D model
— in this case, the pupil. This repetition can negatively impact the visual result, breaking the
illusion of depth and natural movement you’re aiming to achieve in the eye.

To avoid this unwanted effect, it’s best to switch the texture’s wrap mode to Clamp within
the shader settings. With Clamp, when the UV coordinates exceed the texture’s boundaries,
the nearest edge color is extended instead of repeating the image. This results in smoother,
more natural transitions in the displaced areas.


Additionally, you can limit the intensity of the displacement by using a mask that defines
the area of influence. This mask can be a grayscale texture specifically designed to restrict
where the effect is applied, leaving the rest of the surface untouched.

•••

(1.7.d Edge pixels are repeated to ensure accurate results)

1.8 Implementing Tangent Space in Our Shader.

In this section, you’ll complete a new hands-on exercise to deepen your understanding of
how tangent space works and how it relates to the internal variables TANGENT, NORMAL and
BINORMAL.

As a first step, you’ll organize the project to make it easier to implement the various functions
you’ll be working with. Follow these steps:

1 Inside your project, navigate to chapter_01 > spaces and create a new folder named
tangent_space.
2 Inside tangent_space, organize the content by creating the following subfolders:
a materials.

b shaders.

c meshes.

d textures.


If you’ve followed the steps correctly, your project structure should look like this:

•••

(1.8.a The shaders folder has been included inside materials)

To demonstrate the utility of tangent space, you’ll work through the following exercise: using
a shader, you’ll transform the view vector from view space into tangent space. To accomplish
this, you’ll define a custom function called view_to_tangent(), which will take four vectors
as arguments: the tangent, the bitangent, the normal, and the view vector.

This function will calculate the dot product between the view vector and each of the three
vectors that form the tangent space basis. The result will be a new vector expressed in local
tangent space coordinates. You can then use this vector to create camera-dependent visual
effects, such as displacement, shading, or simulated depth.

Note

The files you’ll need for this exercise are available in the downloadable package linked to your
account at: https://jettelly.com/store/the-godot-shaders-bible


To start the development process, you’ll first create a new material and a shader. Begin by
right-clicking on the materials folder and selecting:

Chevron-Circle-Right Create New > Resource > ShaderMaterial

For practical reasons, name this material anime_eyes. Then, repeat the same process in
the shaders folder: right-click and select:

Chevron-Circle-Right Create New > Resource > Shader

Name the shader anime_eyes as well to maintain a clear connection between the two files
and make them easier to identify. If everything was done correctly, your project structure
should now look like this:

•••

(1.8.b The downloadable files have been added to the project)

Now that you have the necessary files, locate the anime_eye.fbx object inside the meshes
folder. Right-click on it and select New Inherited Scene. This action will allow you to create a
new scene based on the original model, preserving its imported structure and properties.


Inside the Scene panel, you’ll find the three layers associated with the 3D model: eye, skin,
and pupil. Assign the anime_skin material to both the eye and skin layers. Then, apply the
anime_eyes material to the pupil layer, as this is where you’ll develop the effect.

Make sure you have already linked the appropriate shader to the material by using the
Shader property.

Note

To assign a shader to a material: 1) select the material, 2) double-click it to open its properties,
3) assign the shader through the Shader property.

If everything was set up correctly, your scene should now look like this:

•••

(1.8.c The appropriate materials have been assigned to each 3D layer)

Before moving on, there are a couple of important details you should check regarding the
anime_eye object in your scene. Based on the current lighting setup:

1 The material applied to the skin and eye is not being affected by light.
2 Meanwhile, the material on the pupil is reacting to light, giving it a grayish tint — when
in fact, it should appear completely white by default.


Additionally, you’ll notice that the pupil doesn’t display any visible texture yet. This is expected
since you haven’t declared a sampler2D in the shader to link an external texture.

Next, you’ll implement these initial elements, as shown in the following code snippet:

•••
shader_type spatial;
render_mode unshaded;

uniform sampler2D _MainTex : source_color;

void vertex()
{
// Called for every vertex the material is visible on.
}

void fragment()
{
vec3 albedo = texture(_MainTex, UV).rgb;
ALBEDO = albedo;
}

If you look closely at the previous code block, you’ll notice that the second line sets the
render_mode to unshaded. According to the official Godot documentation:

Quote-left
The result is only albedo. No lighting or shading is applied to the material,
which speeds up rendering.
https://docs.godotengine.org/en/stable/tutorials/shaders/shader_reference/spatial_

shader.html

Quote-Right
In practical terms, this means that the unshaded render mode disables all lighting calcu-
lations. This can be especially beneficial when optimizing performance — for example, in
mobile games targeting mid to low-end devices.

However, this setting also means that any lighting or depth effects must be manually simu-
lated within the shader. In this particular case, that limitation isn’t an issue, since your goal

is to clearly visualize the displacement effect applied in tangent space. That’s why using
unshaded here is completely appropriate: it allows you to focus solely on the visual behavior
you’re trying to achieve without distractions from additional lighting computations. This not
only simplifies implementation, but also helps you better understand the concept you’re
exploring.

Note

In Godot, spatial shaders allow you to configure multiple render modes. One of these is
cull_disabled, which enables rendering on both the inside and outside faces of a 3D object.
For more details about the different available modes, you can refer to the official documentation
here: https://docs.godotengine.org/en/stable/tutorials/shaders/shader_reference/spatial_
shader.html

With this small setup complete, you can now go to the anime_eyes material and assign a
texture to the Main Tex property. For this exercise, you can use any of the following textures
included in the project:

Chevron-Circle-Right eye_colorful_color.
Chevron-Circle-Right eye_green_color.
Chevron-Circle-Right eye_purple_color.

•••

(1.8.d Main Tex set to eye_colorful_color)

If everything has been set up correctly, the pupil object should now display the assigned
texture. However, at this stage, no displacement effect is visible yet. If you observe the albedo
RGB vector, you’ll notice that the texture is simply being projected onto the surface using the
basic UV coordinates.


To create a displacement effect, you’ll need to complete a few key steps:

Chevron-Circle-Right Define the tangent space for the pupil’s geometry.


Chevron-Circle-Right Transform the camera’s VIEW direction into tangent space by calculating the dot
product with each of the tangent space basis vectors.
Chevron-Circle-Right Distort the UV coordinates using the result of this transformation, creating a new dy-
namic projection that reacts to the viewer’s perspective.

You’ll begin by declaring a method that transforms the view vector into tangent space, as
shown below:

•••
vec3 view_to_tangent(vec3 tangent, vec3 bitangent, vec3 normal, vec3 view)
{
float t = dot(tangent, view);
float b = dot(bitangent, view);
float n = dot(normal, view);
return normalize(vec3(t, b, n));
}

void fragment() { … }

What does this method do? To fully understand how this method works, you first need to
analyze the dot() function. This function, widely used across programming languages,
calculates the dot product between two vectors. In this specific case, it operates on three-
dimensional vectors, and the result is a scalar value, which is then assigned to the variables
t, b, and n based on the corresponding basis vector — tangent, bitangent, and normal.

Let’s begin by reviewing its algebraic definition for three-dimensional vectors:

𝐴 ⋅ 𝐵 = ∑ᵢ₌₁³ 𝐴ᵢ 𝐵ᵢ

(1.8.e)


Which is the same as,

𝐴 ⋅ 𝐵 = (𝐴₁ ∗ 𝐵₁) + (𝐴₂ ∗ 𝐵₂) + (𝐴₃ ∗ 𝐵₃)

(1.8.f)

The equivalent implementation in GLSL can be represented like this:

•••
float dot(vec3 a, vec3 b)
{
return a.x * b.x + a.y * b.y + a.z * b.z;
}

In this context, the view_to_tangent() method projects the view vector (the direction from
the fragment toward the camera) onto the orthonormal basis formed by the tangent,
bitangent, and normal vectors. The result is a new vector expressed in tangent space,
allowing you to work within this coordinate system to apply visual effects such as displace-
ment, shading, or simulated depth.
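To build some intuition with made-up numbers: suppose that, for a given fragment, the basis
were aligned with the view axes, with tangent = (1, 0, 0), bitangent = (0, 1, 0) and normal =
(0, 0, 1), and that the view vector were (0.3, 0.1, 0.9). Then t = 0.3, b = 0.1 and n = 0.9. Each dot
product measures how far the view direction leans along the corresponding basis vector, and
normalize() only rescales the result without changing those proportions.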

Now, if you look at the first operation inside the method, both the tangent vector and the
view vector must be expressed in view space. But how can you be sure that both vectors
are indeed in that space?

According to the official Godot documentation, under the Fragment Built-ins section:

Quote-left
VIEW: A normalized vector from the fragment (or vertex) position to the
camera (in view space). It is the same for both perspective and orthogonal
cameras.

Quote-Right
In simple terms, VIEW is a vector that points from the fragment being processed toward the
camera, and it’s expressed in the camera’s reference space. This vector is automatically

interpolated from the vertices during the rasterization process, and it’s particularly useful for
effects that depend on the viewpoint, such as the displacement effect you’re implementing.

Additionally, the documentation states:

Quote-left
TANGENT: The tangent coming from the vertex() function, in view space. If
skip_vertex_transform is enabled, it may not be in that space.

Quote-Right
Considering that you haven’t applied any custom transformations to the tangent inside the
vertex() method, you can safely assume that the internal TANGENT variable is already in
view space, just like VIEW.

This leads to a new question: how can you visualize this space? To answer that, let’s take a
closer look at the following visual reference:

•••

(1.8.g Difference between world space (left) and view space (right))

Starting with the image on the left, you can observe that the cube is defined in world space,
meaning its position and orientation are determined relative to the origin point in the Viewport.
In contrast, in the image on the right, the cube is defined in view space. This means its
position and orientation are no longer expressed relative to the world but from the camera’s

perspective. As a result, its rotation no longer matches exactly (0𝑥 , 0𝑦 , 0𝑧 ) because its frame
of reference has changed.

The same principle applies to the tangent, bitangent, and normal vectors used in each
fragment within the view_to_tangent() function. All of them must exist within the same
reference space — in this case, view space — for vector operations like the dot product to be
mathematically valid and produce correct results.

Note

From a Technical Artist’s perspective, it’s not strictly necessary to memorize every mathematical
formula when implementing these effects. What matters most is understanding what each
function does and making sure that all vectors involved are expressed in the same space
before combining them in any operation.

Now that you understand how to transform the view direction into tangent space, you can
apply this knowledge to distort the UV coordinates inside the texture() function. This will
allow you to visualize the displacement effect on the pupil’s texture.

To achieve this, you’ll declare a new vector to store the transformation result. Then, you’ll
multiply this vector by a scalar displacement value and apply it directly to the UV coordinates,
as shown below:

•••
vec3 view_to_tangent(vec3 tangent, vec3 bitangent, vec3 normal, vec3 view)
{ … }

void fragment()
{
float offset = 0.5;
vec3 view_tangent_space = view_to_tangent(-TANGENT, BINORMAL, NORMAL,
↪ VIEW);
view_tangent_space *= offset;

vec3 albedo = texture(_MainTex, UV + view_tangent_space.xy).rgb;


ALBEDO = albedo;
}


In the previous code, three key actions take place:

1 A scalar value called offset has been declared and set to 0.5. This value is used for
demonstration purposes only, as you’ll later adjust it to a maximum of 0.3 to achieve a
more controlled result.
2 A new three-dimensional vector called view_tangent_space has been declared and

initialized with the result of the view_to_tangent() method that transforms the view
vector into tangent space.
3 The first two components XY of the view_tangent_space vector have been taken and

added to the original UV coordinates inside the texture() function, generating a visual
displacement effect on the texture.

If you pay close attention to the first argument passed to the view_to_tangent() function,
you’ll notice that the TANGENT variable is negated. Although this might seem counterintuitive
at first, there’s a logical reason behind it: Godot uses a right-handed spatial coordinate
system, in which the forward direction is the negative 𝑧-axis. In this system, it’s necessary to invert the tangent
vector’s direction so that the tangent-bitangent-normal (TBN) basis is constructed correctly.
In contrast, engines like Unity use a left-handed system, where the TANGENT vector can be
used without inversion.

By adding the first two components of the view_tangent_space vector to the UV coordinates,
you are projecting the view direction onto the plane defined by the tangent and bitangent
vectors, which are aligned with the UV space. Since this vector represents the direction
toward the camera in tangent space, the visual result is a dynamic texture displacement
that simulates depth on the object’s surface. This technique enhances the visual perception
of depth without altering the model’s actual geometry, achieving a convincing effect at a
low computational cost.


•••

(1.8.h The Main Tex texture is set to repeat mode)

If you return to the Viewport, you’ll notice two main things:

1 The pupil now displays an effective depth effect.


2 However, the texture appears repeated, creating an undesirable visual result.

Although one possible solution would be to manually disable the Repeat mode on the
eye_colorful_color texture, in this case, you’ll adopt a more robust approach suited for
shader-controlled environments. Since you are implementing this behavior directly within
the shader code, you’ll use the repeat_disable hint next to the sampler2D declaration for
_MainTex. This will prevent the UV coordinates from automatically wrapping when exceeding
the standard range [0.0 : 1.0].

Additionally, to allow greater flexibility from the editor, you’ll declare a new uniform variable
called _Offset, with an adjustable range between [0.0 : 0.3]. This variable will let you
dynamically modify the displacement intensity directly from the Inspector.


The updated shader will look like this:

•••
uniform sampler2D _MainTex : source_color, repeat_disable;
uniform float _Offset : hint_range(0.0, 0.3, 0.05);

void vertex() { … }

vec3 view_to_tangent(vec3 tangent, vec3 bitangent, vec3 normal, vec3 view)


{ … }

void fragment()
{
float offset = _Offset;
vec3 view_tangent_space = view_to_tangent(-TANGENT, BINORMAL, NORMAL,
↪ VIEW);
view_tangent_space *= offset;

vec3 albedo = texture(_MainTex, UV + view_tangent_space.xy).rgb;


ALBEDO = albedo;
}

With a displacement value set to 0.2, the pupil should now look like this:

•••

(1.8.i Depth effect on the pupil)


At this point, you could consider the effect complete, as it successfully fulfills its main
purpose. However, even though the displacement creates a sense of depth, the result still
feels somewhat flat due to the lack of visual contrast or layering. For this reason, in the next
section, you’ll spend a few more minutes enhancing the effect by adding additional el-
ements that will enrich the visual perception and increase the overall realism of the final result.

1.9 Linear Interpolation Between Colors.

Up to this point, the displacement effect is working correctly. However, it’s being applied
uniformly across the entire texture, when ideally it should only affect the pupil — the central
circular area.

To fix this behavior, you can simply introduce a mask: a new texture where the area you want
to displace is white, and the rest is black. Why use these colors? Because mathematically,
white represents a value of 1.0, while black represents 0.0. Therefore, by multiplying this value
by your displacement amount, you can achieve gradual control over the effect: it will be fully
applied in the white areas (the pupil) and suppressed in the black areas (the rest of the eye).

To implement this improvement, you’ll extend the shader by adding a new uniform texture
called _Depth, as shown below:


•••
uniform sampler2D _MainTex : source_color, repeat_disable;
uniform sampler2D _Depth : source_color, repeat_disable;
uniform float _Offset : hint_range(0.0, 0.3, 0.05);

void vertex() { … }

vec3 view_to_tangent(vec3 tangent, vec3 bitangent, vec3 normal, vec3 view)


{ … }

void fragment()
{
float depth = texture(_Depth, UV).r;
float offset = _Offset * depth;
vec3 view_tangent_space = view_to_tangent(-TANGENT, BINORMAL, NORMAL,
↪ VIEW);
view_tangent_space *= offset;

vec3 albedo = texture(_MainTex, UV + view_tangent_space.xy).rgb;


ALBEDO = albedo;
}

As you can see, a new texture called _Depth has been introduced, acting as a mask to control
the displacement effect. This texture should contain white values in the region you want to
affect — the pupil — and black values in the surrounding areas. Since you’re working with
a grayscale image, the red R channel alone is sufficient to represent the intensity at each
point of the texture.

Inside the fragment() method, you access this value using texture(_Depth, UV).r and
store the result in the depth variable. This value determines how much displacement should
be applied to that specific fragment. You then multiply this value by the global displacement
defined by _Offset, generating a fragment-controlled variation: full displacement over the
pupil and no displacement over the surrounding areas.

In this way, the displacement is no longer applied uniformly across the entire surface but
only in the areas you define through the mask. To see this in action, make sure you assign
the eye_depth texture to the Depth property in the Inspector.


•••

(1.9.a The eye_depth texture has been assigned to the Depth property)

Once you’ve correctly assigned the mask, you’ll notice a colored border appearing around
the pupil, visually emphasizing the displacement effect. This color corresponds to the regions
of the texture outside the white area defined by the mask — in other words, outside the pupil
— and it shows how the distortion affects the surrounding texels.

Now is a great time to experiment with additional color layers to complement and enhance
the visual effect. For example, you could introduce extra highlights or decorative details to
reinforce the sense of depth and direction. To achieve this, you’ll add two new global textures
to the shader, as shown below:

•••
uniform sampler2D _MainTex : source_color, repeat_disable;
uniform sampler2D _Depth : source_color, repeat_disable;
uniform sampler2D _Border : source_color, repeat_disable;
uniform sampler2D _Shine : source_color, repeat_disable;
uniform float _Offset : hint_range(0.0, 0.3, 0.05);

void vertex() { … }


For the _Border and _Shine textures, you’ll use the eye_border and eye_shine textures
respectively, which must be assigned from the Inspector. Once these textures are applied
and the necessary adjustments are made in the shader, you’ll be able to visualize the effect
directly in the Viewport, allowing for quick iteration and fine-tuning.

With the textures now declared, the next step is to blend them with the base color of the pupil.
To do this, you’ll use linear interpolation through the mix() function. This function, commonly
used in GLSL, allows you to combine two values (in this case, colors or vectors) based on a
third scalar value that acts as the blending factor.

The basic definition of mix() for a three-dimensional vector is as follows:

•••
vec3 mix(vec3 a, vec3 b, float w)
{
return a + w * (b - a);
}

In this context:

Chevron-Circle-Right The first argument, vec3 a, is the original value (for example, the base color from the
albedo texture).
Chevron-Circle-Right The second argument, vec3 b, is the value you want to introduce (for example, the
border or shine).
Chevron-Circle-Right The third argument, float w, is a value between 0.0 and 1.0 that defines how much b
influences a.

When w = 0.0, the result is completely a. When w = 1.0, the result is completely b.

For any value in between, you get a smooth transition between the two. This allows you to
combine additional visual effects in a controlled and coherent way across the surface of the
eye.
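A quick numeric example of the definition above: mix(vec3(0.0), vec3(1.0), 0.25) returns
vec3(0.25), since 0.0 + 0.25 ∗ (1.0 − 0.0) = 0.25 for each component.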


Now, let’s continue by extending the shader, starting with the implementation of the _Border
texture, as shown below:

•••
void fragment()
{
float depth = texture(_Depth, UV).r;
float offset = _Offset * depth;
vec3 view_tangent_space = view_to_tangent(-TANGENT, BINORMAL, NORMAL,
↪ VIEW);
view_tangent_space *= offset;

vec3 albedo = texture(_MainTex, UV + view_tangent_space.xy).rgb;


vec4 border = texture(_Border, UV);
vec3 lerp_ab = mix(albedo, border.rgb, border.a);

ALBEDO = lerp_ab;
}

Before checking the result in the Viewport, let’s take a moment to break down what’s hap-
pening in the code. If you look at the border variable, you’ll notice it’s a four-component
vector RGBA, which becomes important in the following operation.

In the next line, a vector called lerp_ab is declared, obtained through a linear interpolation
between albedo (the base color from the texture) and border.rgb (the decorative color
from the border texture). This interpolation is controlled using the alpha channel of the
border texture — that is, border.a.

Graphically, this means:

Chevron-Circle-Right When border.a = 0.0, the resulting color is completely albedo.


Chevron-Circle-Right When border.a = 1.0, the resulting color is completely border.rgb.
Chevron-Circle-Right For intermediate values, a smooth blend between both colors is produced.


•••

(1.9.b Comparison between base albedo and linear interpolation across two layers)

In the Viewport, you can now observe a subtle gradient between the pupil and the rest
of the eye. This gradient, both in shape and color, comes from the eye_border texture,
which has been aesthetically adapted to blend into the overall composition. I recommend
experimenting with different colors in this texture to see the principle of linear interpolation in
action and observe how the final result changes.

The only step left is to apply the final visual layer: the eye_shine texture. To do this, you’ll
repeat the same process using another mix() function, as shown below:


•••
void fragment()
{
float depth = texture(_Depth, UV).r;
float offset = _Offset * depth;
vec3 view_tangent_space = view_to_tangent(-TANGENT, BINORMAL, NORMAL,
↪ VIEW);
view_tangent_space *= offset;

vec3 albedo = texture(_MainTex, UV + view_tangent_space.xy).rgb;


vec4 border = texture(_Border, UV);
vec4 shine = texture(_Shine, UV);

vec3 lerp_ab = mix(albedo, border.rgb, border.a);


vec3 lerp_abs = mix(lerp_ab, lerp_ab + shine.rgb, shine.a);

ALBEDO = lerp_abs;
}

As you can see, at this stage a second interpolation has been applied. You start with the
previous result (lerp_ab, which blends the base color and the border) and combine it with
the sum of lerp_ab + shine.rgb. This addition strengthens the highlight effect, allowing
it to overlay smoothly onto the original color based on the intensity defined by the alpha
channel of the eye_shine texture.

Finally, if you return to the Viewport, you’ll be able to see the complete effect: the displace-
ment applied to the pupil along with the layered decorative borders and highlights — all
controlled through masks and the principles of linear interpolation.


•••

(1.9.c Textures working together)

1.10 Introduction to Matrices.

As you dive deeper into shader development, you’ll frequently encounter the concept of ma-
trices. A matrix is a numerical structure made up of elements arranged in rows and columns,
following specific arithmetic and algebraic rules. In the context of graphics programming,
matrices play a crucial role in performing spatial transformations such as translation, rota-
tion, and scaling.

•••

(1.10.a On the left: a mathematical 3x3 matrix. On the right: a 3x3 matrix in .gdshader)

In shaders, matrices are primarily used to transform vertex positions from one space to
another. For instance, the reason you’re able to view a 3D object on a 2D screen is because

its position has been transformed through a series of matrices — most commonly referred to
as MODEL_MATRIX, VIEW_MATRIX, and PROJECTION_MATRIX.

This transformation process happens automatically during the rendering pipeline. However,
you can also use these matrices directly in your shader code to apply custom transformations,
dynamically modify vertex positions, or create camera-dependent visual effects like the
billboard effect.

But how exactly does this process work? Let’s take a simple primitive — a cube — as an
example. At its core, the cube is just a collection of numerical data. There’s no actual 3D
object inside the computer. These numbers represent the cube’s geometry in local space.
So, the first logical step is to transform this data into world space using the MODEL_MATRIX.
According to Godot’s official documentation:

Quote-left
Model/local space to world space transform.

Quote-Right
•••

(1.10.b The vertices in local space have been transformed to world space)

Once the data has been transformed into world space, the cube can be correctly positioned
within the scene. However, it still isn’t visible — there’s no point of view from which to observe
it. It’s as if the cube exists, but your eyes are closed.


To make it visible, you need to transform its position from world space to view space using
the VIEW_MATRIX, which the official documentation describes as:

Quote-left
World space to view space transform.

Quote-Right
•••

(1.10.c The vertices in world space have been transformed to view space)

At this stage, the object can already be represented from the camera’s perspective. However,
to actually display it on your computer screen, its coordinates must be converted from
view space to projected space. This is achieved through the PROJECTION_MATRIX, which,
according to Godot’s official documentation:

Quote-left
View space to clip space transform

Quote-Right
This projected space — also known as clip space — is the final stage before the rasterization
process, which ultimately converts data into the pixels you see on the screen. Therefore,
you can assume that this entire transformation process takes place within the vertex()
function.
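Condensed into code, the chain described above can be sketched inside the vertex() function as follows. This is only an illustration of the order of the transformations; Godot performs the equivalent steps automatically, and you’ll rebuild this pipeline by hand (including the built-in POSITION output used here) in the next section:

•••
void vertex()
{
// Local space -> world space.
vec4 vertex_ws = MODEL_MATRIX * vec4(VERTEX, 1.0);
// World space -> view space.
vec4 vertex_vs = VIEW_MATRIX * vertex_ws;
// View space -> clip space, ready for rasterization.
POSITION = PROJECTION_MATRIX * vertex_vs;
}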


•••

(1.10.d The cube has been projected onto a two-dimensional screen)

Note

In Godot, you’ll find a matrix called MODELVIEW_MATRIX. This matrix combines both the model
(MODEL_MATRIX) and view (VIEW_MATRIX) transformations. As a result, it transforms a point
directly from local space to view space. https://docs.godotengine.org/en/stable/tutorials/
shaders/shader_reference/spatial_shader.html

Understanding matrices can be challenging at first. That’s why, in the following sections,
you’ll spend some time implementing each of these matrices to get a clearer picture of how
they work. You’ll also create custom matrices that allow you to dynamically transform your
object’s vertices directly from the Inspector.

1.11 Built-in Matrices.

In this section, you’ll work through a new hands-on exercise to deepen your understanding
of the internal matrices: MODEL_MATRIX, VIEW_MATRIX, and PROJECTION_MATRIX, and how
they relate to the VERTEX variable. You’ll also create a visual effect known as Billboard, using
the INV_VIEW_MATRIX.


To keep implementation easier and keep things organized, structure your project by following
these steps:

1 Inside your project, navigate to the chapter_01 folder and create a new folder called
matrices.
2 Since this section will be divided into two parts, create two subfolders within matrices:
a built_ins.

b custom.

3 Inside each of these folders, organize the content by creating the following subfolders:
a materials.

b shaders.

c meshes.

d textures.

If you’ve followed these steps correctly, your project structure should look like this:

•••

(1.11.a The built_ins and custom folders have been added to the project)

As mentioned earlier, you’ll implement the Billboard effect using a shader. This effect
automatically orients a 3D object (or QuadMesh) to always face the camera, no matter

where it is in the world. A great example of this is the question mark symbol on the Item Box
in Mario Kart — no matter where you stand, the symbol always faces the camera.

In your case, you’ll implement this behavior directly in the vertex() function by adjusting
the object’s orientation based on the camera’s position. To start, let’s create the material
and shader resources you’ll need, keeping the project organized.

Inside the built_ins > materials folder, right-click and select:

Chevron-Circle-Right Create New > Resource > ShaderMaterial.

For convenience, name this material billboard_effect. Then, repeat the process inside the
Shaders folder: right-click and select:

Chevron-Circle-Right Create New > Resource > Shader.

As usual, make sure you give the shader the same name as the material. This naming
convention makes it easier to identify the connection between the two. If you’ve followed the
steps correctly, your project structure should now look like this:

•••

(1.11.b Both the material and the shader have been added to the project)

To begin the implementation process, start by creating a new 3D scene and add a MeshIn-
stance3D node as its child. Name this node Billboard. Then, assign a QuadMesh to its Mesh
property. After creating the billboard_effect shader and its corresponding material, make
sure to assign the material to the Material Override property of the Billboard object.


•••

(1.11.c A QuadMesh has been added to the scene)

Now, turn your attention to the billboard_effect shader — specifically, the vertex() method.
At first glance, you’ll notice that the function appears empty. However, as you’ve learned,
several mathematical operations are performed automatically behind the scenes, such as
matrix multiplications. If you check Godot’s official documentation, you’ll find the following
explanation:

Quote-left
Users can override the modelview and projection transforms using the
POSITION built-in. If POSITION is written to anywhere in the shader, it will
always be used, so the user becomes responsible for ensuring that it always
has an acceptable value. When POSITION is used, the value from
VERTEX is ignored and projection does not happen.

Quote-Right
To see this in action, you’ll use the render_mode skip_vertex_transform. This mode dis-
ables Godot’s automatic vertex transformations within the vertex() function, giving you full
control over how vertices are positioned and projected. However, before applying this setting,

make sure to configure and assign a texture to the shader. This will prevent the QuadMesh
from appearing completely white or empty during the exercise.

•••
shader_type spatial;
render_mode unshaded;

uniform sampler2D _MainTex : source_color;

void vertex() { … }

void fragment()
{
vec4 albedo = texture(_MainTex, UV);
ALBEDO = albedo.rgb;
ALPHA = albedo.a;
}

In the previous code snippet, a new variable named albedo was declared as a vec4 (rep-
resenting RGBA or XYZW). This variable stores the result of sampling the _MainTex texture
using the UV coordinates. As you already know, this texture will be assigned later through
the Inspector. Unlike the examples in previous sections, this time you’re not only using the
texture’s RGB color channels but also its alpha channel, which is assigned directly to the
built-in ALPHA variable.

The ALPHA variable controls the transparency of the fragment. This allows pixels in the
texture to appear fully opaque, partially transparent, or completely invisible depending
on the alpha values defined in the image. For this exercise, you’ll use the texture named
item_box_symbol_tex, which is included in the downloadable package attached to this
book.

Once you’ve saved the shader and assigned the texture, the Billboard object in your scene
should look like this:


•••

(1.11.d The texture has been assigned to the Billboard object)

Next, you’ll add the skip_vertex_transform property to your shader using the following
code:

•••
shader_type spatial;
render_mode unshaded;
render_mode skip_vertex_transform;

uniform sampler2D _MainTex : source_color;

void vertex() { … }

Once you add this line, your QuadMesh will disappear from the Viewport. This happens
because no vertex transformations are being performed — so nothing is being rendered. As
a result, you’ll need to manually handle the vertex transformations. To begin, transform the
mesh’s vertices from model space to view space using the MODELVIEW_MATRIX, as shown
below:


•••
render_mode skip_vertex_transform;

uniform sampler2D _MainTex : source_color;

varying vec3 vertex_os;

void vertex()
{
vertex_os = VERTEX;
VERTEX = (MODELVIEW_MATRIX * vec4(vertex_os, 1.0)).xyz;
}

There are a few important concepts you need to understand in order to interpret the following
code correctly. Let’s start with the varying keyword. This is used to define a global variable
that is shared between the vertex() and fragment() functions. In this case, the vertex_os
vector can be accessed in both stages of the shader, allowing data to be passed from the
vertex function to the fragment function when needed.

In the first line of the vertex() method, the vertex_os vector is assigned the value of VERTEX.
This means the vertex data is still in object space — hence the _os suffix in the variable name.

Note

It’s considered good practice not only to use descriptive variable names in your shaders but also
to include suffixes that indicate the space the variable belongs to. For example: variable_os
for object space, variable_ws for world space, variable_vs for view space, variable_ss for
screen space.

Next, the VERTEX variable is transformed by multiplying the MODELVIEW_MATRIX with a four-
dimensional vector. The first three components (XYZ) represent the original position of the
vertex (vertex_os), while the fourth component (W) is set to 1.0.

Why this value? The 1.0 represents a position vector in homogeneous space. In computer
graphics, 4D vectors (vec4) are used to allow for affine transformations through 4×4 matrix
multiplication. Setting W = 1.0 tells the GPU that the vector is a position in space, meaning
it will be affected by translation, rotation, and scaling. On the other hand, if W = 0.0, the
vector is treated as a direction, which will be affected by rotation and scaling — but not by

translation. Since vertices represent actual positions in 3D space, it’s essential to ensure that
W is set to 1.0.
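As a quick illustration of the difference, consider transforming a position and a direction with the same matrix. This is only a sketch to highlight the role of W; in production, normals under non-uniform scaling are usually transformed with a dedicated normal matrix rather than MODEL_MATRIX:

•••
// W = 1.0: a position, so translation, rotation, and scale all apply.
vec4 point_ws = MODEL_MATRIX * vec4(VERTEX, 1.0);

// W = 0.0: a direction, so rotation and scale apply, but translation does not.
vec4 dir_ws = MODEL_MATRIX * vec4(NORMAL, 0.0);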

You can achieve the same result by separating the transformation matrices, as shown below:

•••
void vertex()
{
vertex_os = VERTEX;
vec4 vertex_ws = MODEL_MATRIX * vec4(vertex_os, 1.0);
vec4 vertex_vs = VIEW_MATRIX * vertex_ws;
VERTEX = vertex_vs.xyz;
}

In this code snippet, as you can see, the transformation process involves only the
MODEL_MATRIX and VIEW_MATRIX. However, projection is still handled automatically as part
of the final transformation. To take full control — including the projection — you’ll need to use
the built-in POSITION variable, as shown below:

•••
void vertex()
{
vertex_os = VERTEX;
vec4 vertex_ws = MODEL_MATRIX * vec4(vertex_os, 1.0);
vec4 vertex_vs = VIEW_MATRIX * vertex_ws;
vec4 vertex_proj = PROJECTION_MATRIX * vertex_vs;
POSITION = vertex_proj;
}

If you choose this approach, you must ensure that POSITION always contains valid values.
This adds an extra level of responsibility and increases your workload when writing shaders.

Now that you understand this, how can you apply the Billboard effect directly in the shader?
The core idea of this effect is to make the object always face the camera, regardless of its
original orientation in world space.


To achieve this, you can build a custom transformation matrix that combines the object’s
world position with the camera’s orientation. You’ll extract the first three columns of the
INV_VIEW_MATRIX, which, according to Godot’s official documentation:

Quote-left
View space to world space transform.

Quote-Right
This approach keeps the vertices in world space, but oriented from the camera’s point of
view. By combining this custom orientation matrix with the fourth column of the MODEL_MATRIX,
which holds the object’s position in world space, you can ensure the object always faces the
camera.

Below is an example of how to apply this logic in the vertex() function, while preserving the
previous implementation:

•••
void vertex()
{
vertex_os = VERTEX;

mat4 BILLBOARD_MATRIX = mat4(


INV_VIEW_MATRIX[0],
INV_VIEW_MATRIX[1],
INV_VIEW_MATRIX[2],
MODEL_MATRIX[3]
);

vec4 vertex_ws = BILLBOARD_MATRIX * vec4(vertex_os, 1.0);


vec4 vertex_vs = VIEW_MATRIX * vertex_ws;
vec4 vertex_proj = PROJECTION_MATRIX * vertex_vs;
POSITION = vertex_proj;
}

As shown in the example above, a new matrix called BILLBOARD_MATRIX has been declared.
This matrix reorients the object so that it always faces the camera. It is built by taking the first
three columns of INV_VIEW_MATRIX, which represent the camera’s orientation in world space.


This ensures that the vertices remain in their original space, but are reoriented from the
camera’s point of view. The fourth column is taken from MODEL_MATRIX, preserving the object’s
world position.

If you’ve implemented everything correctly, your Billboard object should now face the camera
in the Viewport, regardless of its position in the scene.

•••

(1.11.e The Billboard object always faces the camera)

1.12 Implementing Custom Matrices.

Up to this point, you’ve focused on the built-in matrices that make it possible to render a 3D
object in a scene. But what if you wanted to create your own matrices? According to Godot’s
official documentation, you can define matrices using the following data types:

Chevron-Circle-Right mat2 for 2×2 two-dimensional matrices.


Chevron-Circle-Right mat3 for 3×3 three-dimensional matrices.
Chevron-Circle-Right mat4 for 4×4 four-dimensional matrices.

Each type represents a different kind of transformation space and is used in different scenar-
ios. For instance, you might define a matrix to scale, deform, or rotate the vertices of your
models, whether in 2D or 3D. While a mat3 might work in a 2D game, from a usability and

optimization standpoint, it’s better to use a mat2 — since you only need to transform the 𝑥
and 𝑦 axes in a 2D space.

So, how do you declare and initialize a matrix in shader code? According to Godot’s docu-
mentation, you can do it as follows:

•••
mat2 m2 = mat2(vec2(1.0, 0.0), vec2(0.0, 1.0));
mat3 m3 = mat3(vec3(1.0, 0.0, 0.0), vec3(0.0, 1.0, 0.0), vec3(0.0, 0.0, 1.0));
mat4 identity = mat4(1.0);

These declarations create identity matrices of sizes 2×2, 3×3, and 4×4, respectively. You can
visualize them as follows:

For a two-dimensional matrix:

𝑚2 = ⎛ 1  0 ⎞
     ⎝ 0  1 ⎠

(1.12.a)

For a three-dimensional matrix:

     ⎛ 1  0  0 ⎞
𝑚3 = ⎜ 0  1  0 ⎟
     ⎝ 0  0  1 ⎠

(1.12.b)


To define a four-dimensional matrix, simply extend the 3×3 identity matrix by adding an extra
row and column:

     ⎛ 1  0  0  0 ⎞
     ⎜ 0  1  0  0 ⎟
𝑚4 = ⎜ 0  0  1  0 ⎟
     ⎝ 0  0  0  1 ⎠

(1.12.c)

Note

When you write mat4 identity = mat4(1.0), Godot automatically fills the main diagonal
with 1.0 and the remaining elements with zeros, creating an identity matrix. This is essential for
performing linear transformations without altering the original vector.

As a hands-on exercise, you could create a rotation matrix to visualize the gimbal lock effect.
However, we’ll save that exercise for Chapter 3, where you’ll explore and implement quater-
nions in .gdshader. That way, you’ll have a clearer comparison between both methods. For
now, let’s focus on creating a small custom matrix that transforms only the vertices of your
object.

To begin, go to the FileSystem, and inside the folder custom > materials > shaders, create
a new shader by selecting:

Chevron-Circle-Right Create New > Resource > Shader.

For practical purposes, name this shader transformations. Then, repeat the process in the
materials folder: right-click and select:

Chevron-Circle-Right Create New > Resource > ShaderMaterial.

Make sure to give the material the same name as the shader to keep the connection between
both resources clear. If you followed the steps correctly, your project should now look like this:


•••

(1.12.d A material and a shader have been added to the project)

For the exercise you’re about to complete, you can use any 3D model available on your
computer. However, it’s recommended to use a primitive model, such as a generic BoxMesh.

The transformation you’ll implement will not only scale the object’s vertices but will also apply
a linear transformation known as shearing. Shearing tilts the shape of an object by moving
its vertices along a fixed direction. The amount of this displacement depends on the vertices’
positions along another axis. This transformation doesn’t change the object’s volume, but it
does alter its shape by distorting right angles.

Start by preparing your scene as follows:

Chevron-Circle-Right Create a new 3D scene.


Chevron-Circle-Right Add a MeshInstance3D as a child node.
Chevron-Circle-Right Assign a BoxMesh to the MeshInstance3D.
Chevron-Circle-Right Apply the transformations material to the mesh.


If you’ve followed these steps correctly, your scene should now look like this:

•••

(1.12.e The transformations material has been assigned to the BoxMesh)

Since your object is three-dimensional, you’ll begin the exercise by defining a 3×3 matrix
and initializing it as an identity matrix. This ensures that no visible transformation is applied
to the 3D model at this stage. To do this, open your transformations shader and insert the
following macro above the vertex() function:

•••
shader_type spatial;

#define TRANSFORMATION_MATRIX mat3(vec3(1, 0, 0), vec3(0, 1, 0), vec3(0, 0,
↪ 1))

void vertex()
{
VERTEX = VERTEX * TRANSFORMATION_MATRIX;
}


As you can see in the example, a macro called TRANSFORMATION_MATRIX has been defined.
It represents a 3×3 identity matrix and will be expanded wherever it’s used in the shader
code. Keep in mind that you could also define a matrix using a function or by declaring a new
variable directly inside the vertex() function. However, in this case, the #define directive is
used to improve readability, consistency, and reusability throughout development.

Note

It’s recommended to use the #define directive when: 1) You want to define reusable or constant
blocks of code, 2) You’re aiming for cleaner code organization and reduced duplication, 3) You
need to change a value globally with a single edit. However, it’s generally better to use const
or uniform variables when substitution behavior is not required, as macros don’t offer type
safety. https://docs.godotengine.org/en/stable/tutorials/shaders/shader_reference/shader_
preprocessor.html
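For comparison, and following the Note’s recommendation, the same identity matrix can also be written as a typed global constant instead of a macro. This is a minimal sketch; the constant is declared outside any function:

•••
// Identity matrix as a typed constant instead of a #define macro.
const mat3 TRANSFORMATION_MATRIX = mat3(vec3(1.0, 0.0, 0.0), vec3(0.0, 1.0, 0.0), vec3(0.0, 0.0, 1.0));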

The matrix used earlier didn’t take any parameters, but you could also define it like this:

•••
shader_type spatial;

#define TRANSFORMATION_MATRIX(s) mat3(vec3(s, 0, 0), vec3(0, s, 0), vec3(0, 0,
↪ s))

void vertex()
{
VERTEX = VERTEX * TRANSFORMATION_MATRIX(1);
}

In this version, you’ve defined a macro that takes an argument: s, which adjusts the values
of the identity matrix. This configuration is ideal for applying scale transformations to a 3D
object. You only need to change the argument to scale the object in real time:

Chevron-Circle-Right If s = 1, the BoxMesh keeps its original size.


Chevron-Circle-Right If s = 2, the BoxMesh doubles in size along all axes.
Chevron-Circle-Right If s = 0.5, the BoxMesh scales down to half its original size.


This approach allows for quick experimentation and gives you a clear visual understanding
of the matrix’s effect without modifying multiple lines of code. Later, you could replace the
argument with a uniform variable to make the scale adjustable directly from the Inspector.
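A minimal sketch of that idea, reusing the macro above and exposing the scale factor through a uniform (the name _Scale is just a placeholder):

•••
uniform float _Scale : hint_range(0.1, 3.0) = 1.0;

void vertex()
{
// The uniform is substituted into the macro, so the scale can be tuned from the Inspector.
VERTEX = VERTEX * TRANSFORMATION_MATRIX(_Scale);
}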

•••

(1.12.f A scale transformation has been applied to the BoxMesh)

Now, you might ask — why does the BoxMesh scale when we change the matrix value? This
happens because you’re directly altering the original positions of each vertex in the 3D object
— before those positions are transformed into world space by the MODEL_MATRIX. In other
words, the transformation is applied in object space, right at the point where each vertex
still holds its local coordinates. This lets you modify the object’s geometry directly, without
affecting its global transformation or the scene hierarchy.
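As a concrete example, with s = 2 the corner vertex (0.5, 0.5, 0.5) of the default BoxMesh
becomes (1.0, 1.0, 1.0) before MODEL_MATRIX ever sees it, which is why the whole cube
appears twice as large.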

You can also extend this transformation matrix to apply shearing. To do so, simply introduce
new values into the matrix, changing how one axis influences another.

For example, if you want to apply shearing along the 𝑥-axis based on the 𝑦-axis, you could
modify the matrix as follows:

•••
#define TRANSFORMATION_MATRIX(s) mat3(vec3(s, 1, 0), vec3(0, s, 0), vec3(0, 0,
↪ s))


In this case, the value 1 has been added to the second component of the first vec3. This
means the x-axis is now influenced by the y-axis, tilting the shape of the object as each
vertex’s vertical position increases.
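A quick numerical check with s = 1, remembering that the shader multiplies VERTEX *
TRANSFORMATION_MATRIX: the upper corner (0.5, 0.5, 0.5) becomes (0.5 + 0.5, 0.5, 0.5) =
(1.0, 0.5, 0.5), while the lower corner (0.5, −0.5, 0.5) becomes (0.0, −0.5, 0.5). The higher a
vertex sits along 𝑦, the further it is pushed along 𝑥, which produces the tilt you see in the
Viewport.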

•••

(1.12.g Shearing along the 𝑥-axis)

The same principle applies to other axis combinations, such as x and 𝑧, or 𝑥 and 𝑦. For
example, if you want to apply shearing on the 𝑦-axis, you simply need to add a numerical
value to the first component of the second vec3 in the matrix, as shown below:

•••
#define TRANSFORMATION_MATRIX(s) mat3(vec3(s, 0, 0), vec3(1, s, 0), vec3(0, 0,
↪ s))

This setup introduces an influence from the x-axis onto the 𝑦-axis. In other words, each
vertex’s horizontal displacement now affects its vertical position, causing the object to tilt
diagonally to the side — proportional to its position along the 𝑥-axis.


•••

(1.12.h Shearing along the 𝑦-axis)

This type of transformation is useful for creating visual effects such as animated defor-
mations, weight simulation, dynamic leaning, or even controlled vibrations. The key lies in
understanding how one axis influences another within the matrix structure. Now it’s your
turn — go ahead and experiment. Try modifying the matrix values and observe the different
results you can achieve!


Chapter 2

Lighting and Rendering.


In this chapter, you’ll continue your journey into shaders — this time with a focus on lighting
and rendering. But how exactly is light calculated in a video game? What mathematical
formulas allow us to simulate the interaction between light and objects?

Every time you apply a material to a surface — whether in 2D or 3D — you’re activating a


series of properties that go far beyond a simple texture. These properties include normal
calculations, occlusion, specular reflection, light attenuation, and other physical phenomena
that contribute to the final appearance of the object on screen.

Understanding how these processes work is essential if you want to modify, optimize, or even
reinvent them using custom shaders. Throughout this chapter, you’ll explore how the GPU
interprets light, how different types of lighting behave in Godot, and what tools you have at
your disposal to control the final look of a scene.

Before diving into practical examples, you’ll first review the basic principles behind one of the
most widely used lighting models in real-time graphics: the Lambertian model. You’ll also
learn about the built-in functions that give you access to information like the light direction,
the object’s normal, or the camera position. This foundational knowledge will prepare you to
create advanced visual effects such as toon lighting, halftone shading, or other complex
materials.

2.1 Material Properties.

Whenever you want to assign a new material to a 3D object in Godot, the engine prompts
you to choose the type of material to create. For example, if you click the <empty> field in
the Material Override property of a MeshInstance3D, you’ll see three options:

Chevron-Circle-Right StandardMaterial3D.
Chevron-Circle-Right ORMMaterial3D.
Chevron-Circle-Right ShaderMaterial.

So far, you’ve primarily worked with the third option — ShaderMaterial — as you’ve been
writing your own shaders. But what about the other two? What are they used for — and more
importantly, how do they differ? Let’s begin by comparing them. This will help you better
understand the purpose of each configuration and when it’s appropriate to use one over the
other.


According to the official Godot documentation, StandardMaterial3D:

Quote-left
StandardMaterial3D’s properties are inherited from BaseMaterial3D.
StandardMaterial3D uses separate textures for ambient occlusion,
roughness and metallic maps. To use a single ORM map for all 3 textures, use
an ORMMaterial3D instead.
https://docs.godotengine.org/en/stable/classes/class_standardmaterial3d.html

Quote-Right
The main difference between StandardMaterial3D and ORMMaterial3D lies in the use of
an ORM map — a single RGB texture that packs three different grayscale textures into its
individual channels:

Chevron-Circle-Right O – Occlusion: stored in the red channel (R).


Chevron-Circle-Right R – Roughness: stored in the green channel (G).
Chevron-Circle-Right M – Metallic: stored in the blue channel (B).

This technique can be a bit confusing when you’re just starting out with shaders. For example,
when you create a .png texture, it typically includes four channels by default: RGBA. So how is
it possible to store different textures in individual channels of the same image?

The key is understanding the type of information these textures represent. Occlusion, rough-
ness, and metallic values are all grayscale textures — they don’t encode color, but rather
intensity. That means the red, green, and blue channels of each of these images contain the
same value. As a result, you only need to store that value in a single channel (for example,
storing occlusion in the red channel) and ignore the rest.

Note

By taking advantage of this characteristic, you can combine three grayscale textures into a
single image — each one stored in a different channel. This technique is known as channel
packing, and it not only simplifies resource management but also improves performance by
reducing the number of textures the GPU has to load and process during rendering.
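
To make the idea concrete, here's a minimal sketch (not part of the book's project files) showing how a packed ORM texture could be read in a spatial shader; the orm_tex uniform name is an assumption for this example:

•••
shader_type spatial;

// Hypothetical uniform holding a packed ORM texture (R = occlusion, G = roughness, B = metallic).
uniform sampler2D orm_tex;

void fragment()
{
    vec3 orm = texture(orm_tex, UV).rgb;

    AO = orm.r;        // occlusion from the red channel
    ROUGHNESS = orm.g; // roughness from the green channel
    METALLIC = orm.b;  // metallic from the blue channel
}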

Now then, how does this compare to a ShaderMaterial? Let’s break down the differences
between these three material types to better understand their purpose and usage:


StandardMaterial3D,

Chevron-Circle-Right A material based on Physically Based Rendering (PBR), commonly used for 3D objects.
Chevron-Circle-Right Easy to configure, with built-in support for normal maps, roughness, metallic, occlusion,
fresnel, and many other visual properties.
Chevron-Circle-Right Supports transparency, shadows, refraction, subsurface scattering, and multiple UV
coordinates.
Chevron-Circle-Right The shader is predefined — so no coding is required to use it. However, it can be
GPU-intensive, especially on low-end devices.

ORMMaterial3D,

Chevron-Circle-Right Like StandardMaterial3D, this is a PBR material used for 3D objects, with the key differ-
ence being that it uses a packed ORM texture.
Chevron-Circle-Right Retains all the features of StandardMaterial3D, but is slightly more optimized by re-
ducing the number of texture samplers.
Chevron-Circle-Right It also uses a predefined shader, so no custom code is needed.

ShaderMaterial,

Chevron-Circle-Right Unlike the previous two, this material does not include any predefined properties — you
have full control to define everything manually.
Chevron-Circle-Right Can be used for both 2D and 3D objects, depending on how you configure it.
Chevron-Circle-Right While it includes basic lighting by default, taking full advantage of its potential requires
writing code in Godot’s shader language (.gdshader).
Chevron-Circle-Right Its performance impact depends entirely on your shader code — the more complex
the operations, the greater the load on the GPU.

In conclusion, StandardMaterial3D and ORMMaterial3D are ready-to-use materials built
into Godot. Both are excellent for quickly getting started on a project without writing any
code, as they come with built-in lighting calculations and basic visual effects.

On the other hand, ShaderMaterial gives you full control over what is rendered on screen.
This makes it the perfect tool for creating custom visual effects, stylized materials, or even
complex simulations. In fact, with the right approach, you can fully replicate the behavior of
the built-in materials from scratch — just by writing your own shader code.


That’s exactly what you’ll do throughout this chapter. You’ll recreate many of the visual
features found in Godot’s default materials, but with a focus on custom shader-based effects.
This hands-on practice will give you a deep understanding of the lighting calculations
involved in material creation — and prepare you to build shaders tailored to your project’s
artistic vision.

2.2 Introduction to the Lambertian Model.

One of the simplest — and most widely used — lighting models in computer graphics is the
Lambertian model, named after an observation made by Johann Heinrich Lambert in the
18th century. But how does this model actually work? To understand it, you need to consider
two key variables:

Chevron-Circle-Right The direction of the light source.


Chevron-Circle-Right The surface normal.

The amount of light hitting a surface depends on the angle between that surface and the
direction of the light. If the surface faces the light directly, it receives the maximum possible
illumination. If it faces tangentially — or away from the light — it receives little to no light.

Let’s look at its mathematical definition to better understand how this works:

𝐷 = max(0, 𝑛 ⋅ 𝑙) ⋅ 𝑖 ⋅ 𝑘

(2.2.a)


Where,

Chevron-Circle-Right 𝐷 represents the intensity of diffuse light received by a point on the surface.
Chevron-Circle-Right 𝑘 is the diffuse coefficient or the base color of the surface (typically assigned to the
ALBEDO variable in the fragment() function).
Chevron-Circle-Right 𝑖 is the intensity of the light source.
Chevron-Circle-Right 𝑛 is the surface normal (a unit vector).
Chevron-Circle-Right 𝑙 is the light direction (also a unit vector).

The dot product between 𝑛 and 𝑙, written as (𝑛 ⋅ 𝑙), is equivalent to the cosine of the angle 𝜃
between the two vectors. Since both 𝑛 and 𝑙 are unit vectors (their lengths equal 1), their dot
product can be directly interpreted as 𝑐𝑜𝑠(𝜃). This relationship is incredibly useful for both
developing lighting formulas and implementing them in shaders.
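
For example, if the angle between 𝑛 and 𝑙 is 60°, then 𝑛 ⋅ 𝑙 = cos(60°) = 0.5, so that point receives half of the maximum diffuse intensity; at 0° it receives the full intensity, and at 90° or beyond it receives none.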

While it’s true that you could expand this equation to include additional elements — such
as light color, distance attenuation, or other physical factors — this simplified version of the
Lambertian model is enough to help you understand the core logic behind diffuse lighting
calculations.

•••

(2.2.b Diagram illustrating Lambertian shading)

If you look at the diagram in Figure 2.2.b, you’ll notice that it includes the viewer (or camera)
as part of the geometry. However, the camera is not present in Equation 2.2.a. Why is that?


The reason is simple: the Lambertian model describes purely diffuse reflection. This means
light is scattered uniformly in all directions, regardless of the angle from which the surface is
observed. In other words, the resulting illumination is independent of the viewer or camera
position.

This behavior contrasts with more complex lighting models — like Phong or Blinn-Phong —
which do account for the viewer’s direction when calculating specular reflection. In those
cases, the view direction must be included as an additional variable in per-pixel lighting
calculations.

Despite its simplicity, implementing the Lambertian model in Godot is straightforward thanks
to a set of built-in variables provided by the engine. For example:

Chevron-Circle-Right 𝐷 corresponds to the built-in variable DIFFUSE_LIGHT.


Chevron-Circle-Right 𝑛 is the surface normal, represented by NORMAL.
Chevron-Circle-Right 𝑙 is the light direction, represented by LIGHT.
Chevron-Circle-Right 𝑖 is the light intensity, which you can approximate using ATTENUATION.

Finally, the term 𝑘 — the surface color — is represented by the ALBEDO variable. However, you
don’t need to include it manually in the equation, because Godot automatically multiplies
the diffuse light by ALBEDO internally during the rendering process.

It’s also important to note that lighting calculations in Godot are performed inside the light()
function, which is commented out by default. Here’s what a basic Lambertian model imple-
mentation looks like in Godot’s .gdshader language:

•••

(2.2.c Colors have been defined to identify the variables)


The first step in implementing this model is to enable the light() function. In practice, this
method works as an extension of fragment(), since it also runs once per pixel — but with a
key difference: it’s called once for every light that affects that pixel.

By default, this function is commented out because it’s optional — you’ll only use it if you
want to customize how lighting interacts with the material. But why can this be costly for
the GPU? You already know that fragment() runs once for every visible pixel on screen. For
example, in a game running at 1920×1080 resolution, fragment() is called around 2,073,600
times per frame. The light() function, however, executes not just per pixel, but per light
affecting that pixel. If a single pixel is affected by four different light sources, light() will
be called four times for that same pixel — potentially resulting in over 8 million executions
per frame, depending on the complexity of the scene. That’s why it’s important to use
this function only when necessary, and with care — especially if your project needs to run
efficiently on lower-end devices.

2.3 Implementing the Lambertian Model.

In this section, you’ll complete a series of tasks to help you not only understand how the
Lambertian model works but also how to use the light() function in Godot. You’ll also
explore the ambient_light_disabled render mode, which will let you clearly identify which
light sources are affecting your character by default.

Here’s what you’ll accomplish:

Chevron-Circle-Right Work with the light() function.


Chevron-Circle-Right Implement the Lambertian lighting model.
Chevron-Circle-Right Transform the lighting result into a simple toon-style effect.

Before diving into the technical implementation, let’s first organize your project, just like in
previous chapters. Follow these steps:

1 Inside your project, under the assets folder, create a new subfolder named chapter_02
to store all resources for this chapter.
2 Inside chapter_02, create another folder named lambert.


3 Within the lambert folder, organize your content by creating the following subfolders:

a materials.

b shaders.

c meshes.

d textures.

If you’ve followed the steps correctly, your project structure should now look like this:

•••

(2.3.a The chapter_02 folder has been added under assets)

Next, you’ll implement the example illustrated in Figure 2.2.c. To do so, you’ll create a new
shader and enable the light() function in order to put the various aspects of the Lambertian
model into practice.

For this exercise, you’ll use the suzanne.fbx model, which should be imported into the meshes
folder you created earlier. This file is included in the downloadable package that comes with
the book.

Note

Since the model uses the .fbx extension, you’ll need to follow the process described in Section
1.4 of the previous chapter. That involves opening the file using New Inherited Scene and then
selecting Make Unique to save an editable copy of the model within your project.


Start by creating a new material. Right-click the materials folder and choose:

Chevron-Circle-Right Create New > Resource > ShaderMaterial.

For practical purposes, name this material lambert. Then, create the associated shader. In
the shaders folder, right-click and select:

Chevron-Circle-Right Create New > Resource > Shader.

Use the same name — lambert — for the shader. This helps maintain a clear and organized
relationship between the two resources.

If everything has been configured correctly, your project structure should now look like this:

•••

(2.3.b The resources have been added and Suzanne has been marked as unique)

At this point, you could begin editing the shader right away. However, before doing so, it’s
important to set up the scene properly. This is crucial because enabling the light() function
will change how the object is lit — giving you a clearer view of how Godot’s default lighting
system behaves.


For the scene setup, use a main Node3D as the root, and add a MeshInstance3D as its child.
Assign the previously saved suzanne.tres file to this mesh. As part of this process, keep the
following in mind:

Chevron-Circle-Right The lambert shader must be assigned to its corresponding ShaderMaterial.


Chevron-Circle-Right The ShaderMaterial must be applied to the mesh — specifically in the Material Override
property of the MeshInstance3D.
Chevron-Circle-Right Add a DirectionalLight3D node to the scene. This will allow you to observe how the
lighting changes as you modify the light’s direction.

If all steps have been followed correctly, your scene should now look like this:

•••

(2.3.c The character has been configured in the scene)

As mentioned earlier, the first step is to enable the light() function in your shader. To do
this, open the lambert shader and uncomment the light() function, allowing its content to
run.


Here’s the initial structure of the shader:

•••
shader_type spatial;

void vertex() { … }

void fragment() { … }

void light()
{
// Called for every pixel for every light affecting the material.
// Uncomment to replace the default light processing function with this…
}

Because the light() function replaces Godot’s default light processing, you’ll notice an
immediate change in the object’s appearance once it’s activated. Specifically, the direct
lighting that was affecting the 3D model will partially disappear.

We say partially because, although direct lighting is overridden, the model is
still influenced by ambient lighting, which Godot applies globally by default. This ambient
light comes from the environment settings and is not tied to any specific lights in the scene,
such as DirectionalLight3D.

This distinction is useful — it allows you to clearly separate the visual impact of ambient light
from that of directional light as you begin implementing the Lambertian model.


•••

(2.3.d The character is affected by ambient lighting)

If you want to disable ambient lighting, you need to use the ambient_light_disabled render
mode. This directive tells the engine to completely ignore any lighting contributions from the
environment — that is, the global illumination Godot applies by default when no explicit light
sources are present.

Once you add this line, your character will appear completely dark. This is expected: there
are no active light sources affecting the model visually. However, this condition is quite useful
for the lighting tests you’re about to perform, as it allows you to observe only the results of
your custom lighting calculations inside the light() function.

Your base shader code with the directive applied will look like this:

•••
shader_type spatial;
render_mode ambient_light_disabled;

void vertex() { … }

After applying this render mode — and with a directional light active in the scene — you
should see the model appear entirely black.


•••

(2.3.e No light source is affecting our 3D model)

Now that your character is properly configured, it’s time to implement the lighting function
shown in Figure 2.2.c, as demonstrated below:

•••
void light()
{
float i = ATTENUATION;
vec3 l = LIGHT;
vec3 n = NORMAL;
float D = max(0.0, dot(n, l)) * i;

DIFFUSE_LIGHT += vec3(D);
}

Note

You can also incorporate the light’s color into this operation by using the LIGHT_COLOR variable,
as described in Godot’s official documentation on Light Built-ins: https://docs.godotengine.
org/en/stable/tutorials/shaders/shader_reference/spatial_shader.html

It’s important to mention that the variable names in this function (𝑖, 𝑙, 𝑛, 𝐷) follow traditional
conventions when implementing the Lambertian model. However, in a production setting, it
would be preferable to use more descriptive names or operate directly with Godot’s built-in
variables. Avoiding temporary variables that are only used once can improve both readability
and performance.

A more concise and production-ready version of the same calculation might look like this:

•••
DIFFUSE_LIGHT += vec3(max(0.0, dot(NORMAL, LIGHT)) * ATTENUATION);

That said, for educational purposes, this chapter will continue to use intermediate variables
to help reinforce the connection to the original mathematical model. Once you save your
changes and return to the scene, you should see an immediate change in Suzanne’s lighting
— she now responds only to the direction and intensity of the directional light, as dictated by
the Lambertian model you’ve just implemented.

•••

(2.3.f The Lambertian model has been applied to Suzanne)

As you can see, the 3D model now displays only two tones: light and shadow (white and
black). This outcome is mathematically accurate — it faithfully reflects the behavior of the
Lambertian model in its most basic form.

But why does this simple effect create a sense of volume in the character? The key lies in
the angle between the surface normals and the direction of the light. As this angle changes
across different points on the model, so does the amount of light that hits each point. This
smooth variation in intensity is what creates gradual shading across curved surfaces — like
Suzanne’s cheeks and forehead — allowing you to perceive depth and contour, even in the
absence of color.

Let’s take a closer look at this principle in the next figure, where the relationship between the
angle 𝜃, the surface normal, and the light direction is illustrated graphically.

•••

(2.3.g The Lambertian model has been conceptually applied to a sphere)

A sphere — like any 3D model — has a normal vector at each of its vertices. However, for
illustration purposes, the previous figure highlights only four of them. Each normal points in a
different direction, depending on the curvature and volume of the surface.

If you look closely, you’ll notice that the angle between the normals and the light direction
varies from 0° to 180°. This angular relationship is fundamental to the Lambertian model, as
it determines how much light each part of the object receives.

Let’s revisit how the dot() function behaves in this context:

Chevron-Circle-Right It returns 1.0 (white) when the angle between the normal and the light is 0° — meaning
the light hits the surface perpendicularly.
Chevron-Circle-Right It returns 0.0 (black) when the angle is 90° — the light hits the surface tangentially.
Chevron-Circle-Right It returns –1.0 (completely dark, not useful for diffuse lighting) when the angle is 180° —
the light is pointing in the opposite direction.


However, the Lambertian model uses the function max(0.0, dot(n, l)), meaning any
negative value is discarded. That’s because light coming from “behind” the surface should
have no visual effect on it.

Since diffuse light can’t be negative in the real world, allowing values below zero could
introduce visual artifacts, especially in areas where lighting should have no effect at all. For
that reason, max() is used to ensure that light contributions are always zero or greater.

In some cases, this function is replaced by clamp(), which restricts the result to both a
minimum and a maximum value. This can be useful if you want to ensure the output stays
within a specific range — for example, between 0.0 and 1.0.

The max() function for a 3D vector can be defined like this:

•••
vec3 max(vec3 a, vec3 b)
{
return vec3(
a.x > b.x ? a.x : b.x,
a.y > b.y ? a.y : b.y,
a.z > b.z ? a.z : b.z
);
}
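
If you prefer the clamp() variant mentioned above, the diffuse term inside light() could be written as follows; this is a small sketch that behaves the same as max() here, since the dot product of two unit vectors never exceeds 1.0:

•••
// Equivalent diffuse term using clamp() instead of max():
float D = clamp(dot(n, l), 0.0, 1.0) * i;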

Now that you understand the fundamentals of Lambertian shading, you can take a creative
step further by limiting the gradient between lit (white) and shadowed (black) regions. This
results in a more stylized visual effect. One way to achieve this is by using the smoothstep()
function, which performs a smooth interpolation between two values.


Here’s how the function is defined for floating-point values:

•••
float smoothstep(float edge0, float edge1, float x)
{
float t = clamp((x - edge0) / (edge1 - edge0), 0.0, 1.0);
return t * t * (3.0 - 2.0 * t);
}

The function takes three arguments: an input value x, and two edges edge0 and edge1. It
returns a result between 0.0 and 1.0, forming a smooth transition. This is especially useful
when you want more control over gradient blending or to simulate soft cutoff regions in
lighting.

You can apply smoothstep() directly to the output of the Lambertian model to narrow
the range of intermediate values. This sharpens the contrast between light and shadow,
producing a more graphic, toon-like style.

•••
void light()
{
float i = ATTENUATION;
vec3 l = LIGHT;
vec3 n = NORMAL;
float D = max(0.0, dot(n, l)) * i;
D = smoothstep(0.0, 0.05, D);

DIFFUSE_LIGHT += vec3(D);
}

This operation creates a binary lighting effect, where the transition between bright and dark
areas is more abrupt, yet still smooth enough to avoid harsh edges. It’s a perfect technique
for achieving a stylized look, particularly for art-driven shading models like toon rendering.


•••

(2.3.h Smoothed gradient)

Another interesting effect you can implement is to adjust the lighting tone your character
receives. You can achieve this using the mix() function, which smoothly interpolates between
two colors and returns a new vec3 based on a blend factor.

•••
void light()
{
    float i = ATTENUATION;
    vec3 l = LIGHT;
    vec3 n = NORMAL;
    float D = max(0.0, dot(n, l)) * i;

    D = smoothstep(0.0, 0.05, D);

    vec3 diffuse = mix(vec3(0.042, 0.023, 0.534), vec3(1.0), D);

    DIFFUSE_LIGHT += diffuse;
}

In this case, the variable D is used as a blend factor between two colors: a dark bluish tone
vec3(0.042, 0.023, 0.534) and pure white vec3(1.0). As D increases — meaning the
surface receives more direct light — the resulting color gradually approaches white.

Conversely, in areas that receive little to no light (where D is close to 0), the color retains its
blue tint. This behavior produces a stylized and custom shading effect, making it ideal for
scenes with an artistic aesthetic or projects that require non-photorealistic rendering (NPR).


•••

(2.3.i The resulting shadow has a bluish tint)

2.4 Vector: Points vs. Directions.

Up to this point, we’ve been working exclusively with the implementation of the Lambertian
model. If you’ve followed the steps correctly, you’re likely experimenting with different values
to achieve various lighting tones in Suzanne’s rendering. But this process raises an essential
question: Why does it work? Why does changing certain vectors produce noticeable shifts in
the character’s lighting?

One key concept that wasn’t explicitly mentioned in the previous section is this: in the context
of lighting, the vectors you manipulate — such as the light direction or surface normal —
represent directions, not points in space.

So, what’s the difference between a point and a direction? A point defines a specific position
in 3D space. For example, a vertex on a mesh represents a corner of the model in either local
or global coordinates. It tells you where something is.

In contrast, a direction indicates an orientation without referring to a fixed position. It tells
you where something is pointing, not where it’s located. Vectors like LIGHT, VIEW, and NORMAL
are unit directions — they describe the direction of the light source, the viewing angle of the
camera, or the orientation of the surface at a given fragment.


•••

(2.4.a A point (purple) vs. a direction (orange))

Understanding this distinction is critical, especially as you dive deeper into lighting calcula-
tions in this chapter. When you calculate the dot product between the normal and the light
direction, you’re not comparing spatial positions. Instead, you’re evaluating the angular
relationship between two orientations. This angular relationship determines how much light
hits a surface — and therefore, how bright or shadowed it appears.
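
To make the distinction tangible, here's a small illustrative sketch (not code from the book's project) that builds a direction out of two points inside the vertex() function; the second point is an arbitrary value chosen only for the demonstration:

•••
shader_type spatial;

void vertex()
{
    // VERTEX is a point: it tells you where this vertex sits in local space.
    vec3 point_a = VERTEX;
    vec3 point_b = vec3(0.0, 2.0, 0.0); // another point, chosen arbitrarily

    // Subtracting two points yields a direction; normalizing keeps only
    // its orientation, discarding any notion of position or distance.
    vec3 dir = normalize(point_b - point_a);

    // dot() then compares orientations: how closely the normal points
    // toward point_b, exactly like the n ⋅ l term in the Lambertian model.
    COLOR = vec4(vec3(max(0.0, dot(NORMAL, dir))), 1.0);
}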

2.5 Introduction to the Blinn-Phong Model.

In this section, we’ll talk about the specular effect, the one found in the Metallic > Specular
property of a StandardMaterial3D. This effect lets you render surfaces with shiny reflections,
simulating how light reflects off polished materials.

To understand how it works, you need to consider three main properties:

Chevron-Circle-Right The direction of the light source.


Chevron-Circle-Right The surface normal.
Chevron-Circle-Right The view direction (camera).


•••

(2.5.a On the left, Lambert; on the right, Specular)

This model, widely used in computer graphics, originates from the work of Bui Tuong Phong,
who proposed a method for adding specular highlights to a surface based on the orientation
of its normals.

According to the original Phong model, if you want to simulate specular reflectance, you
must perform the following mathematical operation:

𝑆 = max(0, r ⋅ v)𝑚

(2.5.b)

Here, 𝑟 is the reflection vector of the light, 𝑣 is the view direction (camera), 𝑛 is the normal,
and 𝑚 is the shininess exponent, which controls the concentration of the reflection. A low
value for 𝑚 (for example, 𝑚 = 8) produces a wide, soft highlight, while a high value (such
as 𝑚 = 256) results in a small, sharp reflection.

The Phong model implemented in .gdshader would look like this:


•••

(2.5.c Colors have been defined to identify each variable)

However, in real-time graphics like Godot, it’s common to use an optimized variant known
as the Blinn-Phong model. Instead of calculating the reflection vector, this model uses an
intermediate vector called halfway (ℎ), which represents the midpoint between the light
direction and the view direction. This leads to a new specular formula, which is visually similar
to the Lambertian model (Figure 2.2.a), but with a different approach.

𝑆 = max(0, n ⋅ h)𝑚 (n ⋅ l > 0)

(2.5.d)

Its implementation in .gdshader looks like this:

•••

(2.5.e Colors have been defined to identify each variable)


The more closely the object’s normal aligns with the ℎ vector, the stronger the specular
highlight will be. This produces smooth, realistic reflections that depend on the viewing angle
— unlike the Lambertian model, which does not take the camera’s position into account.

•••

(2.5.f Diagram illustrating the Blinn-Phong model)

2.6 Implementing the Blinn-Phong Model.

In this section, you’ll implement the Blinn-Phong lighting model inside the light() function
using the two configurations previously introduced in Figures 2.5.b and 2.5.d. You’ll also learn
how to enhance the appearance of specular highlights by converting lighting from linear
space to sRGB, which is essential for achieving more accurate visual results on screen.

To do this, follow these steps:

Chevron-Circle-Right We’ll start from the Lambertian shader we already implemented.


Chevron-Circle-Right Then, we’ll add both variants of the specular equation.
Chevron-Circle-Right Finally, we’ll apply a color conversion from linear to sRGB to enhance the highlights.


Before writing any code, let’s organize the project to keep everything structured and consis-
tent:

1 Inside the chapter_02 folder, create a new subfolder named blinn_phong.


2 Inside this folder, add the following subfolders:
a materials.

b shaders.

You won’t need to import new models or textures this time, as you’ll reuse the resources from
the previous section. If you followed the steps correctly, your project structure should now
look like this:

•••

(2.6.a The blinn_phong folder has been added to the project)

As mentioned earlier, you’ll continue working from the lambert shader created in Section 2.3.
However, to preserve the original shader, duplicate it before making any changes. To do this,
right-click the shader file and select Duplicate (or press Ctrl + D).

Once duplicated, rename the file to blinn_phong and move it to the following path:

Chevron-Circle-Right chapter_02 > blinn_phong > materials > shaders.

For the scene setup, you’ll reuse the one based on the Suzanne model. Following the same
logic, duplicate the file lambert.tscn, rename it to blinn_phong.tscn, and save it in:

Chevron-Circle-Right chapter_02 > blinn_phong.


Next, create a new material. Right-click the materials folder and choose:

Chevron-Circle-Right Create New > Resource > ShaderMaterial.

Name this material blinn_phong as well, so that all elements are properly linked and easy
to identify in this section.

If you’ve followed all the steps correctly, your project structure should now look like this:

•••

(2.6.b The material and scene from lambert have been duplicated)

Note

After creating the shader and material, make sure to assign the shader to the material, and
then apply the material to the 3D model in the scene. This is necessary in order to visualize the
effects of your code changes in the final render.

Since the Suzanne scene is already set up, you can now proceed to edit the blinn_phong
shader. Start by restructuring the Lambertian model logic into a separate function. This will
improve readability and make your code easier to reuse.


•••
float lambert(float i, vec3 l, vec3 n)
{
return max(0.0, dot(n, l)) * i;
}

void light()
{
float i = ATTENUATION;
vec3 l = LIGHT;
vec3 n = NORMAL;

float d = lambert(i, l, n);

DIFFUSE_LIGHT += vec3(d);
}

As you can see, the Lambertian model logic remains unchanged. The only difference is that
the main operation — the dot product between the normal and the light direction, scaled by
intensity — is now encapsulated in a separate function called lambert().

Note

In this updated shader, the properties _Shadow, _Highlight, and _Smoothness have been
removed, since this section focuses exclusively on implementing the Blinn-Phong model.

This change improves code modularity, making it easier to reuse the function elsewhere or
tweak it later if needed.

Now, implement the Phong model, as defined by the equation shown in Figure 2.5.b. To do
this, create a new function named phong(), which calculates the specular component using
the view vector, light direction, surface normal, and shininess exponent:


•••
float phong(vec3 v, vec3 l, vec3 n, float m)
{
vec3 r = reflect(-l, n);
return pow(max(0.0, dot(r, v)), m);
}

Note

While Godot provides the built-in SPECULAR variable inside the fragment() function for simpli-
fied specular computation, implementing this function manually helps you understand how
the model works in detail — and, more importantly, how to customize it to your needs.

With both functions ready, you can now update the light() method to include the specular
calculation from the Phong model:

•••
28 void light()
29 {
30     float i = ATTENUATION;
31     vec3 l = LIGHT;
32     vec3 n = NORMAL;
33     vec3 v = VIEW;
34
35     float d = lambert(i, l, n);
36     float s = phong(v, l, n, 64.0);
37
38     DIFFUSE_LIGHT += vec3(d) * 0.3;
39     SPECULAR_LIGHT += vec3(s);
40 }

In this code snippet, line 33 introduces a new variable v to store the view direction — necessary
for computing the specular component with the phong() function. The result is stored in the
scalar variable s (line 36) and added to SPECULAR_LIGHT (line 39).

This modular approach will allow you to replace the Phong model with its Blinn-Phong variant
in the next section, while keeping the rest of the logic intact.


•••

(2.6.c Phong specular with m = 64.0)

Note

In line 38, the diffuse light value is multiplied by 0.3 to reduce its intensity and ensure the
specular component is clearly visible. Also, remember that Godot enables ambient lighting by
default. If you’d like to disable it for more controlled testing, use the ambient_light_disabled
render mode.

Now you’ll implement the Blinn-Phong model. As mentioned earlier, this model changes the
specular calculation by using an intermediate vector called halfway (or half-vector), which
represents the average direction between the light and the view. This technique improves
performance on certain platforms and creates smoother visual results, especially for soft
highlights.

To implement it, define a new function named blinn_phong(), which corresponds to the
equation in Figure 2.5.d:

•••
float blinn_phong(vec3 v, vec3 l, vec3 n, float m)
{
vec3 h = normalize(l + v);
float s = pow(max(0.0, dot(n, h)), m);
s *= float(dot(n, l) > 0.0);
return s;
}


The line vec3 h = normalize(l + v) calculates the halfway vector by normalizing the sum
of the light (LIGHT) and view (VIEW) directions. Then, the dot product between n (NORMAL) and
h determines the specular intensity, raised to the exponent m, which controls the sharpness
of the highlight. The result is multiplied by 1.0 or 0.0 depending on whether the dot product
between n and l is positive — preventing unwanted highlights in shadowed regions.

To apply this model, simply replace the phong() call with blinn_phong() inside the light()
method. The updated code looks like this:

•••
36 void light()
37 {
38     float i = ATTENUATION;
39     vec3 l = LIGHT;
40     vec3 n = NORMAL;
41     vec3 v = VIEW;
42
43     float d = lambert(i, l, n);
44     //float s = phong(v, l, n, 64.0);
45     float s = blinn_phong(v, l, n, 64.0);
46
47     DIFFUSE_LIGHT += vec3(d) * 0.3;
48     SPECULAR_LIGHT += vec3(s);
49 }

As shown in line 45, the s variable now uses blinn_phong() instead of phong(). Although
the internal logic differs, both models share the same arguments:

Chevron-Circle-Right View direction v.


Chevron-Circle-Right Light direction l.
Chevron-Circle-Right Surface normal n.
Chevron-Circle-Right Shininess exponent m.

This change results in a visual output similar to Phong, but with subtle differences in the
shape and distribution of the specular highlight.


•••

(2.6.d Blinn-Phong specular with m = 64.0)

If you compare the results in Figures 2.6.c and 2.6.d, you’ll notice that both models produce a
similar effect. However, the Blinn-Phong highlight appears slightly broader and smoother.
This difference stems from the use of the halfway vector instead of the reflection vector,
which distributes the highlight’s intensity differently across the surface.

Regardless of whether you use Phong or Blinn-Phong, keep in mind that both models operate
in linear lighting space. Mathematically, this is correct. However, human vision does not
perceive light linearly — we are more sensitive to changes in darkness than in brightness.

For this reason, when aiming for realistic lighting or a more visually striking result, it’s recom-
mended to convert your shader’s output from linear space to sRGB, which better matches
human perception. This conversion helps enhance highlights and improves the visual fidelity
of your materials.

Since this section has already covered a lot, we’ll explore color space and sRGB conversion
in more detail in the next section.

2.7 From Linear Lighting to sRGB.

When we talk about sRGB (Standard Red Green Blue), we’re referring to the international
standard IEC 61966-2-1, amendment 1 (2003), which precisely defines the behavior of the sRGB
color space. This standard not only describes the color gamut that can be represented on a
screen but also specifies how colors should be stored, interpreted, and displayed consistently
across different devices.


Until this point in the chapter, you’ve explored various spaces and coordinate systems. But
what exactly does “color space” mean? In simple terms, it refers to three key aspects:

Chevron-Circle-Right What colors can be represented.


Chevron-Circle-Right How those colors are stored in memory (encoding).
Chevron-Circle-Right How to interpret a color — for example, vec3(0.143, 0.456, 0.673) — consistently
across different devices and displays.

Understanding this concept is crucial when developing shaders in Godot. By default, the
images or textures you use (such as those exported from Photoshop in sRGB format) already
include gamma correction. However, real-time lighting calculations must be done in linear
space for physically correct results.

For this reason, it’s common to find conversions between linear space and sRGB at different
stages of the shader workflow. In practical terms, this means:

Chevron-Circle-Right When reading color values from albedo or diffuse textures (which are encoded in sRGB),
you first need to convert them to linear space.
Chevron-Circle-Right When writing the final lighting result to the screen, you need to convert the colors back
to sRGB so they appear correctly.

Note

sRGB color space should be used only for color images when exporting from editing software
like Photoshop. Non-color data textures (such as normal maps, roughness maps, etc.) must remain in
linear space to preserve data accuracy in the [0.0 : 1.0] range.


To simplify these conversions in your shaders, you can define helper functions that handle
the process automatically. Below are two functions that will let you convert between color
spaces accurately:

To convert from sRGB to linear space:

•••
vec3 to_linear(vec3 srgb)
{
vec3 a = pow((srgb + 0.055) / 1.055, vec3(2.4));
vec3 b = srgb / 12.92;
bvec3 c = lessThan(srgb, vec3(0.04045));
return mix(a, b, c);
}

To convert from linear space to sRGB:

•••
vec3 to_sRGB(vec3 linearRGB)
{
vec3 a = vec3(1.055) * pow(linearRGB.rgb, vec3(1.0/2.4)) - vec3(0.055);
vec3 b = linearRGB.rgb * vec3(12.92);
bvec3 c = lessThan(linearRGB, vec3(0.0031308));
return vec3(mix(a, b, c));
}

These functions allow for accurate transformation between color spaces, ensuring that
highlights and shadows in your materials are perceived correctly by the human eye.
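
As a usage sketch that follows the workflow described above (with a hypothetical albedo_tex uniform, and assuming the to_linear() helper is declared in the same shader), an sRGB-encoded texture would be converted to linear space before any lighting math:

•••
// Hypothetical sRGB-encoded texture, used only for this example.
uniform sampler2D albedo_tex;

void fragment()
{
    // Convert the stored sRGB color to linear space before lighting.
    ALBEDO = to_linear(texture(albedo_tex, UV).rgb);
}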

Let’s pay attention to the following reference to understand the concept:


•••

(2.7.a Top: sRGB, Bottom: Linear)

To illustrate the conversion from linear to sRGB space, Figure 2.7.a uses the first channel of
the UV coordinates (uv.x) as a reference. This example helps you visualize how luminance
values are redistributed through gamma correction: mid-tones are brightened, while values
close to black occupy a smaller portion of the dynamic range.
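
If you'd like to reproduce that comparison yourself, a minimal sketch like the one below (assuming the to_sRGB() function shown earlier is present in the same shader) writes the gradient straight to ALBEDO; swap the assignment to compare both versions:

•••
void fragment()
{
    vec3 linear_gradient = vec3(UV.x);             // raw linear ramp
    vec3 srgb_gradient = to_sRGB(linear_gradient); // gamma-corrected ramp

    ALBEDO = srgb_gradient; // replace with linear_gradient to compare
}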

The same principle can be applied to any value calculated in linear space. In your case, you’ll
apply this conversion to the specular component to achieve a sharper and more visually
appealing reflection.

However, keep in mind that the to_sRGB() function returns a three-component vector (vec3),
while the blinn_phong() method returns a scalar value (float). To resolve this mismatch,
you’ll redefine the s variable as a three-dimensional vector and apply the color space
conversion afterward.


The following code snippet shows this modification:

•••
36 vec3 to_sRGB(vec3 linearRGB)
37 {
38     vec3 a = vec3(1.055) * pow(linearRGB.rgb, vec3(1.0/2.4)) - vec3(0.055);
39     vec3 b = linearRGB.rgb * vec3(12.92);
40     bvec3 c = lessThan(linearRGB, vec3(0.0031308));
41     return vec3(mix(a, b, c));
42 }
43
44 void light()
45 {
46     float i = ATTENUATION;
47     vec3 l = LIGHT;
48     vec3 n = NORMAL;
49     vec3 v = VIEW;
50
51     //vec3 d = vec3(lambert(i, l, n));
52     vec3 s = vec3(blinn_phong(v, l, n, 64.0));
53     s = to_sRGB(s);
54
55     DIFFUSE_LIGHT += vec3(0.0);
56     SPECULAR_LIGHT += s;
57 }

If you look at line 36, you’ll see that the to_sRGB() function has been added to the shader.
Then, in line 52, the variable s is defined as a vec3 to store the result of the blinn_phong()
function. In line 53, that value is converted from linear space to sRGB.

Note

In this example, diffuse light has been disabled (DIFFUSE_LIGHT += vec3(0.0)) to focus
exclusively on the specular effect. It’s also recommended to disable ambient lighting using the
ambient_light_disabled render mode, allowing you to observe the impact of sRGB conversion
on the highlight more clearly.

If everything has been implemented correctly, the visual result will show a more intense and
noticeable specular highlight that better matches the behavior expected by the human eye.


•••

(2.7.b Left: Linear specular; Right: sRGB specular)

Although this example applies the linear-to-sRGB conversion only to the specular component,
you can also apply it to the diffuse component, since it is also calculated in linear space.

Every game has its own visual style and unique rendering needs. So, feel free to experiment
with these conversions and tweak the values to achieve a result that aligns with the artistic
direction of your project.

2.8 Introduction to the Rim (Fresnel) Effect.

The Rim effect, also known as the Fresnel effect (named after French physicist Augustin-Jean
Fresnel), is an optical phenomenon that describes how the amount of reflected light on a
surface changes depending on the viewing angle. In practical terms, this effect appears as
a glowing edge around the outer parts of an object when viewed from oblique angles.

If you were using a StandardMaterial3D, you could easily enable this effect through the
Rim property, which is disabled by default. However, in this section, you’ll learn how to
implement the rim effect manually in a shader, giving you greater creative control and a
deeper understanding of how it works internally.


•••

(2.8.a Left: Lambert; Right: Rim)

How does the Fresnel effect work exactly? The principle behind the Fresnel effect is relatively
simple: when viewing a surface head-on — that is, when the view vector and surface normal
are parallel (with an angle close to 0°) — the amount of reflected light is minimal or even
nonexistent. In contrast, when you observe the surface at a grazing angle — when the view
vector is perpendicular to the normal (with an angle close to 90°) — reflection reaches its
maximum value.

•••

(2.8.b Diagram illustrating the Fresnel or Rim effect)


This behavior produces a noticeable highlight along the edges of objects. Visually, it appears
as a luminous silhouette around the model, which can be used to enhance contours or to
create stylized effects, similar to an outline or an artistic glow.

From a mathematical perspective, a common way to approximate the rim effect in shaders
is with the following equation:

𝐹 = (1.0 − max(0.0, n ⋅ v))𝑚

(2.8.c)

Where n is the surface normal, v is the view vector normalized from the surface point to the
camera, and 𝑚 is the exponent that controls the sharpness of the effect. The lower the value
of 𝑚 (e.g., 𝑚 = 1.0), the smoother and more spread out the rim effect will be. Conversely,
higher values (e.g., 𝑚 > 5.0) produce a thinner, sharper, and brighter edge.

You can implement this function directly in your .gdshader using the Godot Shader Language
as shown below:

•••

(2.8.d Colors have been defined to identify each variable)

The image above shows both the mathematical equation for the Rim (Fresnel) effect and
its direct implementation in shader code.

It’s important to note that in Godot, this effect is already built into the material system.
According to the official documentation, you can control the rim effect from the
fragment() function by writing to the built-in RIM variable.


The behavior of RIM is influenced by the ROUGHNESS property, which affects how the
illuminated edge is scattered. However, we’ll explore this in more detail in the next section,
where you’ll learn how to manually integrate and customize the rim effect in your own shader.

2.9 Implementing the Rim Effect.

In this section, you’ll implement the Rim effect using two different approaches: first, by relying
on Godot’s built-in shader variables, and then by applying the custom function you defined
in the previous section (Figure 2.8.d).

The goal is to understand both the simplified method provided by the engine and the un-
derlying technical foundations of the effect, so you can apply it more flexibly in various
contexts.

To achieve this, you’ll follow these steps:

Chevron-Circle-Right We’ll duplicate the blinn_phong shader and rename it so you can work from that
version.
Chevron-Circle-Right Also, we’ll duplicate the blinn_phong.tscn scene, keeping the Suzanne model as your
reference.
Chevron-Circle-Right Finally, we’ll implement both approaches to the Rim effect inside the shader to compare
their behavior.

Before writing code, organize your folder structure to keep your project tidy. Proceed with the
following:

Chevron-Circle-Right Inside chapter_02, create a new folder called rim_fresnel.


Chevron-Circle-Right Inside this folder, add the following subfolders:
Chevron-Circle-Right materials.
Chevron-Circle-Right shaders.

Since you’ll continue using the Suzanne model, there’s no need to add new meshes or textures.

Next, duplicate the blinn_phong shader and move it to the following path:

Chevron-Circle-Right rim_fresnel > materials > shaders.


Once it’s in place, rename the shader to rim_fresnel to keep it consistent with the section
name.

Now create a new ShaderMaterial. Right-click on the materials folder and select:

Chevron-Circle-Right Create New > Resource > ShaderMaterial.

Name the material rim_fresnel as well, so its connection to the shader is clear and easy to
identify.

If you’ve followed the steps correctly, your project structure should now look like this:

•••

(2.9.a The shader and material have been added to the rim_fresnel section)

Before implementing the functions inside the shader, make sure everything is connected
correctly. Assign the rim_fresnel shader to its corresponding material, and then apply this
material to the Suzanne model inside the rim_fresnel.tscn scene. This step is essential to
visualize the changes made during this section.

To understand how the Rim effect works in Godot, start by briefly reviewing the official
documentation, where the built-in variables available in both the fragment() and light()
functions are listed.

Note

You can find this information at the following link: https://docs.godotengine.org/en/stable/


tutorials/shaders/shader_reference/spatial_shader.html


In the Fragment built-ins section, you’ll find the internal RIM variable, which controls the
intensity of the rim effect calculated by the engine. You can set this variable directly
in the fragment() function, as shown below:

•••

(2.9.b RIM output variable)

However, you’ll notice that RIM doesn’t appear in the Light built-ins section of the documen-
tation. Why is that?

In Godot, when working only with the fragment() function, you are responsible for assigning
values directly to internal variables such as ALBEDO, SPECULAR, EMISSION, and others. In this
mode, the engine automatically handles lighting calculations, and any modifications (such
as applying the rim effect with RIM) are immediately reflected on screen.

In contrast, when you define the light() function, you activate deferred lighting mode for
that material. This changes the shader’s behavior: Godot disables part of the automatic
logic in fragment() and expects you to manually calculate the lighting components. As a
result, many internal variables like RIM no longer have a visual effect in fragment() when
light() is present.

In other words:

Chevron-Circle-Right Without light(): the shader handles everything automatically. You only need to write
code in fragment() and assign values to built-in variables.
Chevron-Circle-Right With light(): you must manually recalculate lighting for every light affecting the
fragment. The fragment() function loses some of its visual impact.


Note

In .gdshader documentation, many internal variables include qualifiers such as in, out, and
inout, which indicate their behavior in the pipeline:

Chevron-Circle-Right in vec3 VIEW: read-only in fragment(). Contains the direction from the fragment to
the camera.
Chevron-Circle-Right inout vec3 NORMAL: comes interpolated from the vertex() function and can be read
and modified in fragment() before lighting is calculated.

This means that if you assign a value to a variable like RIM inside fragment() while light() is
active, you won’t see any effect, since the deferred lighting system ignores it.

To properly study how RIM works, it’s best to begin by disabling the light() function. This
way, you can observe how it behaves visually in fragment() without interference.

Here’s how you’ll proceed:

Chevron-Circle-Right Temporarily comment out the content of the light() function.


Chevron-Circle-Right Use the internal variables RIM, ROUGHNESS, and RIM_TINT inside fragment() to observe
the effect on the Suzanne model.

Comment out the light() function in the rim_fresnel shader like this:


•••
/*
void light()
{
    float i = ATTENUATION;
    vec3 l = LIGHT;
    vec3 n = NORMAL;
    vec3 v = VIEW;

    vec3 d = vec3(lambert(i, l, n));
    vec3 s = vec3(blinn_phong(v, l, n, 128.0));
    s = to_sRGB(s);

    DIFFUSE_LIGHT += d;
    SPECULAR_LIGHT += s;
}
*/

As you can see, the entire light() function has been commented out, disabling its logic
temporarily. Once you’ve made this change, your model should look like this:

•••

(2.9.c Lighting is calculated from the fragment stage)

Even with the light() function commented out, your model still shows lighting — mainly
from diffuse and ambient light. This happens because, without a custom light() function,
Godot defaults to its internal lighting system within the fragment() stage.


To enable the rim effect in this context, simply use the internal variables RIM, ROUGHNESS, and
RIM_TINT in the fragment() function. Here’s a basic implementation:

•••
void fragment()
{
vec3 albedo = texture(_MainTex, UV).rgb;

ROUGHNESS = 0.8;
RIM = 1.0;
RIM_TINT = 0.5;
ALBEDO = albedo;
}

Let’s break down the role of each variable:

Chevron-Circle-Right RIM controls the intensity of the Rim effect on the model: 0.0 disables it completely,
while 1.0 applies it fully.
Chevron-Circle-Right ROUGHNESS affects the thickness of the illuminated edge. Values near 0.0 produce a
sharp, narrow effect; values near 1.0 create a wider, softer effect.
Chevron-Circle-Right RIM_TINT determines how the Rim effect blends with the material color. At 0.0, the rim
is completely white. At 1.0, it blends with the material’s color (overlay mode).

•••

(2.9.d Rim effect applied to Suzanne in the fragment stage)


As shown above, applying the rim effect in the fragment() function is quite simple thanks to
Godot’s internal variables. However, if you now reactivate the light() function, you’ll notice
the effect disappears. This is because defining light() shifts responsibility for lighting to that
function, and the values set in fragment() for properties like RIM, ROUGHNESS, or RIM_TINT
no longer take effect.

To restore the rim effect while the light() function is active, you’ll implement a custom
version based on the equation shown in Figure 2.8.d. To do this:

Chevron-Circle-Right First, comment out the RIM, ROUGHNESS, and RIM_TINT lines in the fragment() function
so they don’t interfere with rendering.
Chevron-Circle-Right Then, inside the light() function, add the following code:

•••
float fresnel(vec3 n, vec3 v, float m, float s)
{
float f = 1.0 - max(0.0, dot(n, v));
return s * pow(f, m);
}

Although this function was called rim() in Figure 2.8.d, the functionality remains the same.
Here, a new argument s has been added to serve as an intensity multiplier, allowing for better
visual control of the effect.

Now integrate the Fresnel effect directly into the light() function:


•••
55 void light()
56 {
57     float i = ATTENUATION;
58     vec3 l = LIGHT;
59     vec3 n = NORMAL;
60     vec3 v = VIEW;
61
62     float f = fresnel(n, v, 5.0, 2.0);
63     vec3 d = vec3(lambert(i, l, n));
64     d += vec3(f);
65     vec3 s = vec3(blinn_phong(v, l, n, 128.0));
66     s = to_sRGB(s);
67
68     DIFFUSE_LIGHT += d;
69     SPECULAR_LIGHT += s;
70 }

Analyzing the code, in line 62 you define a new floating-point variable f, which stores the
result of the fresnel() method. Then in line 64, this value is added to the diffuse component
d, creating the visual rim highlight effect.

The values m = 5.0 (exponent) and s = 2.0 (multiplier) were chosen for demonstration
purposes. Feel free to experiment with different values to achieve the style you want.

•••

(2.9.e Rim effect implemented in the light stage)


2.10 Introduction to the Anisotropic Effect.

Up to this point, you’ve explored several specular reflection models to understand how to
interpret and implement mathematical functions in the .gdshader language. However, the
models used so far represent only a small subset of what’s available.

When working with an anisotropic reflection model, the first question you should ask is:
Which model will work best for my project? While it’s true that several models can produce
similar visual results under certain conditions, the differences can become significant when
considering factors like visual finish, computational cost, or optimization.

Some common anisotropic specular models include:

Chevron-Circle-Right Ashikhmin-Shirley.
Chevron-Circle-Right Ward.
Chevron-Circle-Right Cook-Torrance.
Chevron-Circle-Right Anisotropic Phong.
Chevron-Circle-Right Anisotropic Blinn-Phong.

To keep the focus of this book accessible and practical, you’ll examine two of these: first, the
anisotropic Phong model, due to its simplicity and low computational cost; and then the
Ashikhmin-Shirley model, which is more complex but offers a higher-quality visual result.

Start by analyzing the following mathematical expression:

A = \frac{(a_u + 1)(a_v + 1)}{8\pi} \cdot (h \cdot n)^{\frac{a_u (h \cdot t)^2 + a_v (h \cdot b)^2}{1 - (h \cdot n)^2}}

(2.10.a)

This equation is an extension of the classic Phong model, adapted to reflect anisotropy by
using different exponents for the tangent and bitangent directions. In this formula:

Chevron-Circle-Right 𝑎𝑢 is the exponent along the tangent direction (𝑡).


Chevron-Circle-Right 𝑎𝑣 is the exponent along the bitangent direction (𝑏).
Chevron-Circle-Right ℎ is the halfway vector (LIGHT + VIEW).


Chevron-Circle-Right 𝑛 is the normal (NORMAL).


Chevron-Circle-Right 𝑡 is the tangent (TANGENT).
Chevron-Circle-Right 𝑏 is the binormal (BINORMAL).

At first glance, this equation might seem complex, but the idea is straightforward: by adjusting
the parameters 𝑎𝑢 and 𝑎𝑣 , you distort the circular shape of the specular highlight into an
ellipse aligned to the surface’s orientation. This distortion helps simulate materials such as
brushed metal, hair, or glossy fabrics, where light reflects in a directional pattern.

Below is a visual illustration of how varying 𝑎𝑢 and 𝑎𝑣 affects the specular highlight:

•••

(2.10.b Anisotropic specular deformation in both directions)

As shown in Figure 2.10.b, the 𝑎𝑢 variable controls the width of the specular reflection, while
𝑎𝑣 controls its height. In other words, increasing 𝑎𝑢 extends the reflection horizontally along
the tangent, and increasing 𝑎𝑣 stretches it vertically along the bitangent.

If you want to implement Equation 2.10.a in your shader, you could do it like this:


•••

(2.10.c Colors have been defined to identify each variable)
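
Since Figure 2.10.c is only shown as an image, here is a minimal sketch of Equation 2.10.a in .gdshader code. The function name and argument order are our own; only the math comes from the equation:

•••
float anisotropic_phong(float au, float av, vec3 n, vec3 l, vec3 v, vec3 t, vec3 b)
{
    // Halfway vector between the light and view directions.
    vec3 h = normalize(l + v);
    float NdotH = max(dot(n, h), 0.0001);
    float HdotT = dot(h, t);
    float HdotB = dot(h, b);

    // Exponent that stretches the highlight along the tangent and bitangent.
    // The small clamp avoids a division by zero when n and h are aligned.
    float exponent = (au * HdotT * HdotT + av * HdotB * HdotB)
                   / max(1.0 - NdotH * NdotH, 0.0001);

    // Normalization factor of Equation 2.10.a.
    float norm = ((au + 1.0) * (av + 1.0)) / (8.0 * PI);

    return norm * pow(NdotH, exponent);
}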

This model allows you to deform the shape of the specular reflection in a controlled way
by adjusting 𝑎𝑢 and 𝑎𝑣 to achieve the desired effect. While it’s not a physically accurate
model and doesn’t conserve energy precisely, its low computational cost makes it suitable
for stylized rendering or non-photorealistic effects.

You can now visualize the effect on a sphere:


•••

(2.10.d Anisotropic specular reflection on a sphere)

As seen in Figure 2.10.d, the anisotropic effect produces a highly directional specular reflection
whose shape depends on the values assigned to 𝑎𝑢 and 𝑎𝑣 . This deformation of the specular
lobe is useful for representing materials where reflection is distributed unevenly.

However, the full implementation above can be optimized using a simpler and more efficient
approximation that still yields a visually acceptable result in performance-sensitive contexts.

This simplified version is defined by the following expression:

A = max(n ⋅ h, 0.001)^[a_u(h ⋅ t)^2 + a_v(h ⋅ b)^2]

(2.10.e)

Which, when implemented in code, looks like this:


•••

(2.10.f Colors have been defined to identify each variable)
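
As with the previous figure, the listing is shown as an image, so here is a minimal sketch of Equation 2.10.e (again, the function name is our own and only illustrative):

•••
float anisotropic_phong_simple(float au, float av, vec3 n, vec3 l, vec3 v, vec3 t, vec3 b)
{
    vec3 h = normalize(l + v);
    float HdotT = dot(h, t);
    float HdotB = dot(h, b);

    // No normalization factor and no divisions: just the exponentiated n.h term.
    float exponent = au * HdotT * HdotT + av * HdotB * HdotB;
    return pow(max(dot(n, h), 0.001), exponent);
}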

Unlike the first version (based on Equation 2.10.a), this approximation produces a smoother
and more efficient specular reflection without requiring additional divisions. In practice, this
kind of simplification is ideal for stylized effects or low-end devices, where computational
savings make a significant impact.


•••

(2.10.g Left: Classic Phong; Right: Simplified Phong)

As shown in Figure 2.10.g, the classic Phong model (left) produces a sharper, more focused
specular lobe, while the simplified version (right) softens the result by removing the normal-
ization factor. Although both models are useful in stylized contexts, if you require a more
accurate and physically grounded representation of anisotropy, you’ll need to rely on more
advanced models.

One of the most prominent models in this category is Ashikhmin-Shirley, whose mathemat-
ical formulation is defined as follows:

A = [√((a_u + 1)(a_v + 1)) / (8π)] ⋅ (n ⋅ h)^[(a_u(h ⋅ t)^2 + a_v(h ⋅ b)^2) / (1 − (n ⋅ h)^2)] / [(v ⋅ h) ⋅ max(n ⋅ l, n ⋅ v)]

(2.10.h)

This model evolves from the anisotropic Phong model but introduces several critical elements
that bring it closer to real-world light behavior over rough surfaces:

Chevron-Circle-Right The numerator includes a square root normalization factor to ensure energy conserva-
tion.
Chevron-Circle-Right The exponent applied to 𝑛 ⋅ ℎ takes into account the orientation of the halfway vector
with respect to the tangent and bitangent, enabling precise deformation of the specular
lobe.


Chevron-Circle-Right The denominator introduces view-angle dependence through 𝑣 ⋅ ℎ and the maximum
of 𝑛 ⋅ 𝑙 and 𝑛 ⋅ 𝑣, improving angular reflectance response.

What makes this model especially interesting is that it strikes a balance between visual
precision and artistic control, making it suitable for creating realistic materials with directional
microstructure.

The original Ashikhmin-Shirley model also includes an approximation of the Fresnel effect,
using Schlick’s formula, defined as:

F = r_s + (1 − r_s)(1 − (v ⋅ h))^5

(2.10.i)

This term improves the material’s behavior at grazing angles by increasing the intensity of the
specular reflection as the view direction moves away from the surface normal. However, its
use is optional depending on the visual style you’re aiming for. Both the Ashikhmin-Shirley
model and the Fresnel term can be implemented in a single method in Godot as shown
below:


•••

(2.10.j Colors have been defined to identify each variable)

In this implementation:

Chevron-Circle-Right 𝑟𝑠 represents the base reflectance.


Chevron-Circle-Right 𝑎𝑢 and 𝑎𝑣 control anisotropy along the tangent and bitangent directions, respectively.
Chevron-Circle-Right The formula includes a correction term to avoid division by zero and improve numerical
stability.
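
For reference, the Fresnel portion of that method (Equation 2.10.i) can also be written as a small standalone helper; the function name below is our own:

•••
float schlick_fresnel(float rs, float VdotH)
{
    // rs is the base reflectance; VdotH is the dot product between VIEW and the halfway vector.
    return rs + (1.0 - rs) * pow(1.0 - VdotH, 5.0);
}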

The resulting visual effect is shown in the next figure:


•••

(2.10.k Ashikhmin-Shirley model)

As you can see, this model produces a sharper and more directional specular lobe, whose
shape depends on the orientation of the halfway vector with respect to the tangent and
bitangent. This level of precision makes it especially useful for simulating materials like hair,
satin fabrics, brushed metals, or any surface with a structured, realistic directional reflection.

2.11 Implementing the Ashikhmin-Shirley Model.

In this section, you’ll put the Ashikhmin-Shirley model into practice by applying it to the
hair of an anime-style character. The goal is to observe how this model behaves on more
complex geometry and how its anisotropic component can enhance the visual style of the
render — especially for materials like hair, which tend to reflect light directionally.

To begin, let’s organize the project to maintain a clear and consistent structure. Follow these
steps:

Chevron-Circle-Right Inside the chapter_02 folder, create a new folder called anisotropy.
Chevron-Circle-Right Inside this folder, organize the content as follows:
Chevron-Circle-Right materials.
Chevron-Circle-Right shaders.
Chevron-Circle-Right meshes.
Chevron-Circle-Right textures.


Next, create the material that will be applied to the character’s hair. Right-click on the
materials folder and select:

Chevron-Circle-Right Create New > Resources > ShaderMaterial.

Name this material character_hair.

Then, create the corresponding shader by right-clicking on the shaders folder and selecting:

Chevron-Circle-Right Create New > Resources > Shader.

Name this file anisotropic_reflection, keeping it consistent with the section title.

For this section, you’ll use an anime-style character along with its corresponding textures
and materials, all of which are available as part of the book’s downloadable package.

Note

This book includes a downloadable package that contains all the necessary files to follow
along with each exercise. You can download it directly from: https://jettelly.com/store/
the-godot-shaders-bible

If everything has been configured correctly, your project structure should now look like this:


•••

(2.11.a Project structure)

Begin the implementation by dragging the character_model into the Viewport. Make sure
to center its position by setting its transform values to (0.0, 0.0, 0.0).

An interesting aspect of this model is how its materials are configured through the Surface
Material Override property. If you select the MeshInstance3D node (named CharacterModel),
you’ll see that it includes three material slots, labeled 0, 1, and 2.

These slots correspond to the materials assigned in the 3D modeling software used to create
the asset (in this case, Maya):

Chevron-Circle-Right Slot 0 refers to the character’s eyes. Assign the character_eyes material here.


Chevron-Circle-Right Slot 1 corresponds to the hair. Assign the character_hair material to this slot. Make sure
the anisotropic_reflection shader is already assigned to the material before applying
it to the model — just as you’ve done in previous sections.
Chevron-Circle-Right Slot 2 refers to the character’s skin. Assign the character_skin material in this slot.

Once the materials are assigned correctly — and since no code has been added to the
anisotropic_reflection shader yet — the character’s hair will appear completely flat, without
any anisotropic reflection. Its appearance in the Viewport should look like this:

•••

(2.11.b The materials have been assigned to the character)

Obviously, this will change once you start implementing the necessary functions in your
shader. To do that, open the file anisotropic_reflection.gdshader and go to the fragment()
function.

Next, add the following lines of code:


•••
shader_type spatial;

uniform sampler2D _MainTex;

void vertex()
{
// Called for every vertex the material is visible on.
}

void fragment()
{
vec3 albedo = texture(_MainTex, UV).rgb;
ALBEDO = albedo;
}

//void light() {
// Called for every pixel for every light affecting the material.
// Uncomment to replace the default light processing function ...
//}

As you can see, you’ve declared a new three-dimensional vector named albedo, which stores
the result of sampling the _MainTex texture using the model’s UV coordinates. This value
is then assigned to the built-in ALBEDO variable, allowing the texture color to be correctly
applied to the geometry.

After making this change, assign the texture character_hair_tex to the Main Tex property in
the Inspector, under the character_hair material. This texture contains the base color of the
character’s hair.


•••

(2.11.c The texture has been assigned to the character’s hair)

Although the current result does not yet reflect the effect we’re aiming for, it serves as a
starting point to understand the implementation we’ll carry out in this section. The Ashikhmin-
Shirley model needs to be implemented inside the light() function, but before going back
to the shader, we need to add a light source to the scene. To do this, add a DirectionalLight3D
by selecting the Node3D and choosing:

Chevron-Circle-Right Add Child Node > DirectionalLight3D

This will give us control over the direction and intensity of the light, which is essential to
observe how the reflection behaves on the character’s hair.

Returning to the .gdshader file, enable the light() function by uncommenting it. When
doing this, you’ll notice the character’s hair becomes unlit. This happens because all lighting
calculations are now delegated to this function, and we haven’t defined any logic inside
it yet. Despite this, the character will still be affected by ambient light, which is handled
automatically.
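
At this stage, the uncommented function is simply an empty block, something like this:

•••
void light()
{
    // All direct-light processing is now delegated here; with an empty body,
    // only ambient light reaches the hair, so it looks unlit for the moment.
}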

To properly structure our implementation, we’ll begin by separating the necessary functions
into blocks, which will make the code easier to understand and maintain. We’ll use the
equation presented in Figure 2.10.j from the previous section as our base.


Declare the function just between the fragment() and light() methods. For practical
purposes, we’ll call it ashikhmin_shirley():

•••
void fragment()
{
vec3 albedo = texture(_MainTex, UV).rgb;
ALBEDO = albedo;
}

float ashikhmin_shirley()
{
return 0.0;
}

void light()
{

This model requires several variables to perform its calculations. Below is the list of arguments
we’ll pass to the function, which include reflection factors, orientation vectors, and geometric
surface properties:

Chevron-Circle-Right 𝑟𝑠 : the base reflection factor.
Chevron-Circle-Right 𝑎𝑢 : the anisotropy exponent along the tangent direction.
Chevron-Circle-Right 𝑎𝑣 : the anisotropy exponent along the binormal (bitangent) direction.
Chevron-Circle-Right 𝑛: the NORMAL variable.
Chevron-Circle-Right 𝑣: the VIEW variable.
Chevron-Circle-Right 𝑙: the LIGHT variable.
Chevron-Circle-Right 𝑡: the TANGENT variable.
Chevron-Circle-Right 𝑏: the BINORMAL variable.


Next, we’ll include these variables as arguments in the function:

•••
float ashikhmin_shirley(float rs, float au, float av, vec3 n, vec3 l, vec3 v,
↪ vec3 t, vec3 b)
{
return 0.0;
}

In the next step, we’ll implement the body of this function by dividing it into three logical
blocks, based on the mathematical formula presented in Figure 2.10.j. This breakdown will
allow us to interpret and develop the model step by step in a clear and structured way.

•••

(2.11.d The anisotropic model functions have been separated)


To implement the model, we’ll use the general equation shown in Figure 2.11.d as our founda-
tion. One strategy we can follow is to break it down into smaller blocks. For example, if we
look at the specular equation, we’ll notice several operations based on the dot product:

Chevron-Circle-Right 𝑛 ⋅ 𝑙 (NdotL).
Chevron-Circle-Right 𝑛 ⋅ 𝑣 (NdotV).
Chevron-Circle-Right 𝑛 ⋅ ℎ (NdotH).
Chevron-Circle-Right 𝑣 ⋅ ℎ (VdotH).

Since some of these variables are used in both the exponent and the Fresnel term, it’s most
efficient to define them at the beginning of our function:

•••
float ashikhmin_shirley(float rs, float au, float av, vec3 n, vec3 l, vec3 v,
↪ vec3 t, vec3 b)
{
vec3 h = normalize(l + v);
float NdotL = max(dot(n, l), 0.0001);
float NdotV = max(dot(n, v), 0.0001);
float NdotH = max(dot(n, h), 0.0001);
float VdotH = max(dot(v, h), 0.0001);

return 0.0;
}

There are two important points in this initial block:

Chevron-Circle-Right The variable h is the halfway vector, calculated as a normalized sum of the light and
view directions. Normalization ensures its length is exactly 1.0.
Chevron-Circle-Right The use of max(..., 0.0001) avoids values close to or equal to 0.0, which could cause
visual errors or rendering artifacts, especially at grazing angles.


Next, we can incorporate the numerator and denominator of the model, leaving the exponent
undefined for now:

•••
float ashikhmin_shirley(float rs, float au, float av, vec3 n, vec3 l, vec3 v,
↪ vec3 t, vec3 b)
{
vec3 h = normalize(l + v);
float NdotL = max(dot(n, l), 0.0001);
float NdotV = max(dot(n, v), 0.0001);
float NdotH = max(dot(n, h), 0.0001);
float VdotH = max(dot(v, h), 0.0001);

float exponent; // <-- not initialized

float specular = sqrt((au + 1.0) * (av + 1.0)) * pow(NdotH, exponent);


specular /= (8.0 * PI) * VdotH * max(NdotL, NdotV);

return 0.0;
}

According to Figure 2.11.d, the exponent depends on three components:

Chevron-Circle-Right ℎ ⋅ 𝑡 (HdotT).
Chevron-Circle-Right ℎ ⋅ 𝑏 (HdotB).
Chevron-Circle-Right 𝑛 ⋅ ℎ (NdotH).

We’ve already defined (𝑛 ⋅ ℎ), so we only need to add (ℎ ⋅ 𝑡) and (ℎ ⋅ 𝑏), as shown below:


•••
float ashikhmin_shirley(float rs, float au, float av, vec3 n, vec3 l, vec3 v,
↪ vec3 t, vec3 b)
{
vec3 h = normalize(l + v);
float NdotL = max(dot(n, l), 0.0001);
float NdotV = max(dot(n, v), 0.0001);
float NdotH = max(dot(n, h), 0.0001);
float VdotH = max(dot(v, h), 0.0001);

float HdotT = dot(h, t);


float HdotB = dot(h, b);

float exponent = au * pow(HdotT, 2.0) + av * pow(HdotB, 2.0);


exponent /= 1.0 - pow(NdotH, 2.0);

float specular = sqrt((au + 1.0) * (av + 1.0)) * pow(NdotH, exponent);


specular /= (8.0 * PI) * VdotH * max(NdotL, NdotV);

return 0.0;
}

While the Ashikhmin-Shirley model can function without the Fresnel term, we’ll include it to
complete the equation and improve the specular response on object edges:

•••
float specular = sqrt((au + 1.0) * (av + 1.0)) * pow(NdotH, exponent);
specular /= (8.0 * PI) * VdotH * max(NdotL, NdotV);

float f = rs + (1.0 - rs) * pow(1.0 - VdotH, 5.0);


specular *= f;

return specular;
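
For clarity, here is the complete ashikhmin_shirley() function assembled from the fragments above; nothing new is added, it is simply the pieces gathered in one place:

•••
float ashikhmin_shirley(float rs, float au, float av, vec3 n, vec3 l, vec3 v, vec3 t, vec3 b)
{
    vec3 h = normalize(l + v);
    float NdotL = max(dot(n, l), 0.0001);
    float NdotV = max(dot(n, v), 0.0001);
    float NdotH = max(dot(n, h), 0.0001);
    float VdotH = max(dot(v, h), 0.0001);

    float HdotT = dot(h, t);
    float HdotB = dot(h, b);

    // Anisotropic exponent (deforms the lobe along the tangent and bitangent).
    float exponent = au * pow(HdotT, 2.0) + av * pow(HdotB, 2.0);
    exponent /= 1.0 - pow(NdotH, 2.0);

    // Normalized specular lobe.
    float specular = sqrt((au + 1.0) * (av + 1.0)) * pow(NdotH, exponent);
    specular /= (8.0 * PI) * VdotH * max(NdotL, NdotV);

    // Schlick Fresnel term (Equation 2.10.i).
    float f = rs + (1.0 - rs) * pow(1.0 - VdotH, 5.0);
    specular *= f;

    return specular;
}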


Now, to use this function inside the light() method, we must keep in mind the following:

Chevron-Circle-Right The TANGENT and BINORMAL variables are not directly available in light(). To solve this,
we’ll declare them as varying variables and assign their values inside the fragment()
method.
Chevron-Circle-Right The rs, au, and av variables must be declared as uniform so they can be adjusted from
the Inspector.

Let’s proceed to declare and initialize the required global and uniform variables so our shader
can properly use the anisotropic model inside light():

•••
1 shader_type spatial;
2

3 uniform sampler2D _MainTex;


4 uniform float _AU : hint_range(0.0, 256.0, 0.1);
5 uniform float _AV : hint_range(0.0, 256.0, 0.1);
6 uniform float _ReflectionFactor : hint_range(0.0, 1.0, 0.1);
7

8 varying vec3 _tangent;


9 varying vec3 _binormal;
10

11 void vertex() { … }
15 //Line break (12 to 14)
16 void fragment()
17 {
18 _tangent = TANGENT;
19 _binormal = BINORMAL;
20

21 vec3 albedo = texture(_MainTex, UV).rgb;


22 ALBEDO = albedo;
23 }

If we examine the code, we’ll notice the material properties are declared between lines 4
and 6. Then, in lines 8 and 9, the varying variables that will store the tangent and binormal
vectors are declared. These are initialized in the fragment() method, as shown in lines 18
and 19. This ensures that the TANGENT and BINORMAL vectors, which are unavailable inside
light(), can be accessed through _tangent and _binormal.


Now that everything is correctly set up, we can go to the light() method and use the
ashikhmin_shirley() model. To do this, we declare a new float variable called aniso, and
initialize it with the result of the method. The code block looks like this:

•••
void light()
{
float aniso = ashikhmin_shirley(_ReflectionFactor, _AU, _AV,
NORMAL, LIGHT, VIEW, _tangent, _binormal);

SPECULAR_LIGHT += aniso;
}

Since anisotropic reflection is an advanced form of specularity, the result is directly assigned
to the SPECULAR_LIGHT output.

To check how this reflection affects our character in the Viewport, follow these steps:

Chevron-Circle-Right Select the character_hair material.


Chevron-Circle-Right Go to the Inspector.
Chevron-Circle-Right Set one of the properties, Au or Av, to its maximum value (e.g., 256.0).
Chevron-Circle-Right Keep the other property (Au or Av) at 0.0.
Chevron-Circle-Right Set the Reflection Factor to 1.0.

After applying these adjustments, the character’s hair will begin to show anisotropic highlights
whose shape depends on the orientation of the tangent vector assigned to the model.

The result in the Viewport should look as follows:


•••

(2.11.e Material properties have been modified)

However, the anisotropic reflection presents a few rendering issues. For example:

Chevron-Circle-Right The reflection affects the hair uniformly, regardless of the light direction. This is an issue
because if we look at the underside of the character’s hair, we’ll notice that reflections
are still rendered in that area.
Chevron-Circle-Right The reflection remains consistent with the 3D model’s shape. However, it should be
distorted according to the hair texture.
Chevron-Circle-Right The hair is still only affected by ambient lighting, so it continues to look flat.

We’ll address these problems in our shader, starting with the first one.

To prevent the reflection from affecting areas that should remain in shadow, we’ll multiply
the result of the anisotropic model by the Lambert diffuse term, which is useful because:

Chevron-Circle-Right It returns values close to 1.0 (white) when the normal points in the same direction as
the light.
Chevron-Circle-Right It returns values close to 0.0 (black) when the normal opposes the direction of the light.

This allows us to automatically attenuate reflections in regions with no direct lighting.


•••
float lambert(float i, vec3 l, vec3 n)
{
return max(0.0, dot(n, l)) * i;
}

void light()
{
float aniso = ashikhmin_shirley(_ReflectionFactor, _AU, _AV,
NORMAL, LIGHT, VIEW, _tangent, _binormal);
aniso *= lambert(ATTENUATION, LIGHT, NORMAL);

SPECULAR_LIGHT += aniso;
}

By implementing the Lambert model in our code, the graphical result will better match
the scene’s lighting conditions. Specular highlights will now fade out in shadowed areas,
producing a more realistic result.

The render should now look like this:

•••

(2.11.f Left: aniso only; Right: aniso multiplied by Lambert)

As you can see in Figure 2.11.f, applying the Lambert model helps eliminate reflections in
areas that should be shadowed.


To solve the second issue mentioned earlier, we need to modulate the anisotropic reflection
using the albedo channels. This is done by declaring a new vec3 inside the light() method
and copying the current values of the ALBEDO variable:

•••
void light()
{
float aniso = ashikhmin_shirley(_ReflectionFactor, _AU, _AV,
NORMAL, LIGHT, VIEW, _tangent, _binormal);

vec3 albedo = ALBEDO;


aniso *= albedo.r * albedo.g * albedo.b;
aniso = smoothstep(0.0, 0.1, aniso);
aniso *= lambert(ATTENUATION, LIGHT, NORMAL);

SPECULAR_LIGHT += aniso;
}

As you can see, the anisotropic effect is modulated through the operation
aniso *= albedo.r * albedo.g * albedo.b. This multiplies the reflection value by the
product of the RGB color channels, attenuating the specular component wherever the
texture is dark. The smoothstep(0.0, 0.1, aniso) call then remaps the attenuated value
back into the 0.0 to 1.0 range, keeping the highlight visible and well defined.

To address the third issue, we simply assign the value of albedo directly to the DIFFUSE_LIGHT
output variable, which allows us to see the texture’s colors in their original tones:


•••
void light()
{
float aniso = ashikhmin_shirley(_ReflectionFactor, _AU, _AV,
NORMAL, LIGHT, VIEW, _tangent, _binormal);

vec3 albedo = ALBEDO;


aniso *= albedo.r * albedo.g * albedo.b;
aniso = smoothstep(0.0, 0.1, aniso);
aniso *= lambert(ATTENUATION, LIGHT, NORMAL);

DIFFUSE_LIGHT += albedo;
SPECULAR_LIGHT += aniso;
}

This configuration produces the following visual result:

•••

(2.11.g Albedo and anisotropic specularity working together)


Chapter 3: Procedural Shapes and Vertex Animation.


In this chapter, you will explore several fascinating topics, including quaternions, matrix
rotations, and two-dimensional procedural shapes. For the latter, you will work with UV
coordinates. Since these are Cartesian coordinates, you will draw geometric areas on them
using simple mathematical formulas and limits.

You will also examine a fundamental set of tools for noise generation — a key resource for
creating organic materials, non-repetitive animations, and natural simulations. We will focus
on the noise functions available in GLSL and HLSL, explaining their internal logic, their core
differences, and how to use them efficiently within Godot.

When it comes to rotations, you will learn how to use quaternions as an alternative to tra-
ditional matrices. You will see why quaternions are widely used in 3D graphics and how to
implement them in GDSL to avoid issues such as gimbal lock. You will also explore how to
build custom rotation matrices for animating vertices or manipulating geometry.

Finally, you will be introduced to Compute Shaders — a powerful tool that allows you to
execute parallel code directly on the GPU, beyond the traditional graphics pipeline. While
their use in Godot is still limited, understanding how they work will help you expand your
technical possibilities and prepare for more advanced effects in future versions of the engine.

Each of these topics will be presented with practical examples to help you visualize their
usefulness within an artistic and technical workflow. The ultimate goal is to expand your
set of tools and techniques so you can solve complex problems through the smart use of
custom shaders.

3.1 Drawing with Functions and Inequalities.

When we talk about functions and inequalities, we are referring to mathematical expressions
that can be applied directly to Cartesian coordinates. Some of the most common include:

Chevron-Circle-Right Linear functions.


Chevron-Circle-Right Quadratic functions.
Chevron-Circle-Right Trigonometric functions.
Chevron-Circle-Right Absolute value functions.
Chevron-Circle-Right Exponential functions.


Chevron-Circle-Right And more.

Each of these functions has unique properties that, when combined, allow you to construct
complex shapes procedurally. While a deep analysis of their behavior and a detailed break-
down of their implementation in Godot’s shader language could easily fill an entire book,
in this chapter we will take a more hands-on approach. You will focus on translating these
functions into the shader context while designing a procedural figure step by step.

As a concrete example, you will build a pirate-style skull using only mathematical functions
and conditional statements, as shown in the following visual reference:

•••

(3.1.a Pirate skull on Desmos)

Note

Some time ago, I wrote a book titled Shaders and Procedural Shapes in Unity 6, which explores
this topic in depth. While the focus was on Unity, the functions and concepts described there
are fully transferable to Godot. If you are interested in learning more about that project, you
can check it out at the following link: https://jettelly.com/store/visualizing-equations-vol-2

If you look closely at Figure 3.1.a, you will notice that the skull is built almost entirely from math-
ematical equations applied to two-dimensional coordinates. These equations — defined
later in this chapter — allow you to delimit visible regions on the screen through comparisons


and operations on the 𝑥 and 𝑦 coordinates. This approach greatly simplifies procedural
design, as each part of the figure is tied to a precise mathematical condition.

Now, where should you start if you wanted to create this figure from scratch? While there
is no single “correct” entry point, you can follow a logical workflow that will help you better
understand UV coordinates and, over time, master two-dimensional design in shaders.

As a first step, you will explore the shape in Desmos, a powerful graphing tool that lets you
experiment with equations and instantly visualize their effects. It is recommended that you
have the following page open as you begin your analysis:

Chevron-Circle-Right https://www.desmos.com/calculator

Note

This book includes a downloadable project containing a folder named case_study, where you
will find more examples of procedural shapes created using Desmos. These additional cases
will allow you to further explore shape construction through equations and experiment with
different strategies for visual modeling based on coordinates.

Assuming you are already inside the Desmos calculator interface, you will now begin de-
veloping the procedural shape step by step. The first step is to declare and initialize the
UV coordinates on the Cartesian plane, which will allow you to work with functions applied
directly to these variables, as shown below:


•••

(3.1.b https://www.desmos.com/calculator/i5ad2cvmqh)

This first step is particularly valuable because it allows you to understand how UV coordinates
behave in a normalized space. From the visualization in Desmos, you can draw the following
observations:

1 The 𝑢 coordinate corresponds directly to the 𝑥-axis.
2 The 𝑣 coordinate corresponds to the 𝑦-axis.
3 Both coordinates are constrained between 0.0 and 1.0, which represents the typical UV space in a fragment shader.
4 To center any shape on the plane, you must subtract 0.5 from each coordinate, effectively shifting the origin to the center of the domain.

With this foundation in place, you are ready to begin designing the general shape of the skull.
As shown in Figure 3.1.a, the head consists of two regions defined by implicit equations:

The first describes the upper part of the skull, represented by a circular equation:

u^2 + v^2 < r^2

(3.1.c)


The second corresponds to the chin, modeled as a rectangular deformation with smooth
edges:

min(n_1 − |u|, k_1)^4 + min(n_2 − |v|, k_2)^4 < r^4

(3.1.d)

You will start by using constant values to set the position and size of both elements on the
plane. This will make it easier to fine-tune them later when combining them into a single
cohesive shape.

•••

(3.1.e https://www.desmos.com/calculator/9kgoilxgz2)

An important detail visible in Figure 3.1.e is the intersection between the two shapes. This
phenomenon is particularly interesting from a computer graphics perspective. Later, when
you translate these mathematical operations into GDSL inside your shader, the visible regions
of each shape will evaluate to 1.0, while the areas outside those regions will evaluate to 0.0.

This means that if you combine both shapes through direct addition, the intersection area
will result in a value of 2.0. This excess can lead to unwanted visual artifacts, such as color or
luminance saturation — especially if these values are being used to control color, opacity, or
light intensity.


•••

(3.1.f Contrast values have been adjusted to represent the difference in values)

Therefore, when combining shapes, you should consider techniques such as clamping
(clamp()), averaging, or even the max() operation, depending on the visual behavior you
want to achieve. We will explore these methods in more detail later.
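
As a quick preview of how this looks in GDSL, here is a small sketch; uv is assumed to hold centered coordinates, and shape_a and shape_b are hypothetical 0.0/1.0 masks, not part of the Desmos setup:

•••
// Two example region masks, each evaluating to 0.0 or 1.0.
float shape_a = float(dot(uv, uv) < 0.25);            // a circle of radius 0.5
float shape_b = float(abs(uv.x) + abs(uv.y) < 0.4);   // a diamond

float union_clamp = clamp(shape_a + shape_b, 0.0, 1.0); // add the masks, then saturate
float union_max = max(shape_a, shape_b);                // take the larger mask instead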

For now, you will avoid having the two shapes overlap in Desmos by restricting the rectangle’s
area so it does not interfere with the circle. To achieve this, you can limit the domain of the
function using brackets, ensuring that the equation is only applied within a specific range. In
this case, you will constrain the rectangle’s function so that it only operates outside the area
defined by the circle, as shown below:

•••

(3.1.g https://www.desmos.com/calculator/2hytxw2ee6)


Note

Procedural shapes can be demanding on the GPU, so it’s recommended to use them with
caution — especially in production environments for video games. However, understanding
these techniques deepens your control over UV coordinates and equips you with tools to
optimize visual performance by combining textures with mathematical functions.

It’s worth noting that you can also implement these kinds of conditional limits directly in GDSL.
Whether you do so will depend on the desired result, since explicit clipping is not always
necessary if the visual effect does not require it.

Now, let’s move on to the next step in the design: the skull’s eyes. In this case, you will also
use a circular equation, but there’s no need to duplicate it. Thanks to horizontal symmetry,
you can simply apply the function 𝑎𝑏𝑠(𝑢) to automatically mirror the shape, giving you both
eyes with a single mathematical expression.

•••

(3.1.h https://www.desmos.com/calculator/6dbg8pf7vw)

As you can see, the equation you just added corresponds to the same shape defined earlier
in Figure 3.1.c. The key difference is that this time, you applied the absolute value to the 𝑢
coordinate. This small adjustment takes advantage of the object’s horizontal symmetry,
allowing you to represent both eyes with a single operation and thereby optimize the process.


Additionally, if you wish, you can use this new region as a clipping condition for the overall
head shape. This would allow you to create a precise cutout in the skull area, respecting the
construction hierarchy and maintaining control over shape overlaps.

•••

(3.1.i https://www.desmos.com/calculator/qqv6aqwmjp)

With this adjustment, you reduce the size of the eyes to achieve the desired visual result. This
technique allows you to fine-tune visual details with precision, without the need to define
multiple independent shapes.

Next, you can define the nose. One option would be to use a linear function in the form:

𝑣 > 𝑚𝑢 + 𝑏

(3.1.j)

However, this expression describes only a straight line. To model a triangular nose, you need
to define a region bounded by two symmetrical lines that converge at a single point. You
can achieve this in Desmos in two steps:

1 Apply the absolute value to the u coordinate to create horizontal symmetry.


2 Restrict the expression to a specific vertical range to close the shape.


For example:

•••

(3.1.k https://www.desmos.com/calculator/tkw4cieh3u)

This condition generates an inverted triangular region positioned at the lower center of the
head — perfect for representing the nose. By constraining the 𝑣 range, you prevent the shape
from extending beyond what is necessary, keeping it compact and well-controlled.

As shown in Figure 3.1.k, the function’s parameters take the values 𝑚 = 2.0 and 𝑏 = 0.05.
Additionally, the v coordinate has been inverted to tilt the slope downward, and the absolute
value of 𝑢 has been applied to create the horizontal symmetry characteristic of a triangular
skull nose. By adjusting these values, you can control both the width and the vertical position
of the shape.

With the nose defined, you can move on to the pirate bandana. There are multiple ways
to approach this element, but a simple and effective option for this character is to use a
condition based on a square root. Specifically, you can visualize all points where the square
root of 𝑥 is less than 𝑦, which can be expressed as:

√x < y

(3.1.l)


This expression produces an upward curve across the face, simulating part of the fabric’s
contour. You can then trim this region using the previously defined areas for the head and
eyes, ensuring that the bandana does not overlap or interfere with other parts of the design.

•••

(3.1.m https://www.desmos.com/calculator/nvkem2ifdl)

As shown in Figure 3.1.m, after applying the square root–based condition, you limited the
result to two regions: the head area and the eye region. This ensures that the pirate bandana
does not overlap important facial features or extend beyond the skull’s edges.

It’s important to note that in this case you are using 𝑥𝑦 coordinates instead of 𝑢𝑣. This
is because Desmos starts 𝑥 and 𝑦 from 0.0, which makes it ideal for directly representing
expressions like √x < y without requiring recentering.

At this point, only a few elements remain to complete the character: the eyebrows and the
crossed bones behind the head. For these, you will introduce a fundamental resource in
computer graphics — two-dimensional rotation using matrices.

You will create a function in Desmos called Rotation, which will take three arguments: the 𝑥
and 𝑦 coordinates of the point to transform, and an angle 𝑎. The rotation is defined as:

Rotation(𝑥, 𝑦, 𝑎) = (cos(𝑎) 𝑥 + sin(𝑎) 𝑦, − sin(𝑎) 𝑥 + cos(𝑎) 𝑦)

(3.1.n)


This transformation allows you to rotate any point on the plane, which is particularly useful
for giving expression to the eyebrows or adjusting the orientation of the bones.
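
The same transformation translates almost directly into GDSL. Here is a minimal sketch; the helper name rotate_uv and the radians() conversion are our own choices:

•••
vec2 rotate_uv(vec2 p, float a)
{
    // Equation 3.1.n applied to the point p, with the angle a given in radians.
    return vec2(cos(a) * p.x + sin(a) * p.y,
               -sin(a) * p.x + cos(a) * p.y);
}

// Example usage inside fragment(): vec2 p0 = rotate_uv(uv, radians(45.0));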

For the eyebrows, you will use soft cylinders, defined with the same equation you saw in
Figure 3.1.d, but with lower exponents to create smoother edges. Since you’ll be applying
rotation, you will first initialize a new point to store the rotated 𝑢𝑣 coordinates, and then
define a specific rotation angle for each eyebrow. Separating these variables will let you
reuse the same base shape in different positions and orientations.

•••

(3.1.o https://www.desmos.com/calculator/juc94owkuo)

In Figure 3.1.o, you can see that a constant rotation angle of 45°— represented as 𝑅1 — has
been defined. A new point, 𝑝0 , is then declared and initialized by applying the rotation to the
previously offset 𝑢𝑣 coordinates. This new point replaces the original coordinates, allowing
you to precisely determine the position and orientation of the character’s right eyebrow.

To create the left eyebrow, you can optimize the process using a horizontal reflection. Instead
of rotating a second set of coordinates, you simply declare a new point, 𝑝1 , as the mirror
of 𝑝0 across the vertical axis. This lets you reuse the same base shape and shading logic,
achieving perfect symmetry with minimal computational overhead.


•••

(3.1.p https://www.desmos.com/calculator/oillqby4fv)

With this expression, you define the left eyebrow by reflecting the 𝑢 coordinates to achieve
symmetry without duplicating complex calculations. The point 𝑝1 is obtained by applying
the same rotation used for 𝑝0 , but with the coordinates shifted in the opposite direction. This
way, you generate both eyebrows from a single rotated base shape, improving efficiency
while maintaining visual consistency.

To complete the reference figure, the last step is to add the crossed bones positioned behind
the head. You will follow the same procedure used earlier: first, define a new rotated point,
and then apply it to an equation that produces the desired shape. In this case, you will use a
variation of the cylindrical form, adjusting the minimum values to sculpt the ends and create
a more organic silhouette that closely resembles a bone. The resulting equation is as follows:


•••

(3.1.q https://www.desmos.com/calculator/g9u8f63rdi)

This expression produces an elongated shape with rounded edges and wider ends, resulting
in the characteristic form of a bone. By using 0.04 as the minimum value in the 𝑚𝑖𝑛()
function, you smooth out the lateral cut, avoiding the straight edge of a traditional cylinder
and giving the shape a more refined, stylized appearance.

It’s important to limit the bone’s region using the head’s definition. This ensures the shape
does not visually overlap the skull, maintaining a clean and well-structured composition.

Following the same approach as before, you can duplicate the bone by applying a horizontal
reflection. To do this, declare a new point 𝑝3 that represents the reflection of 𝑝2 along the 𝑥
axis, and then apply the same rotation transformation:


•••

(3.1.r https://www.desmos.com/calculator/9m7xkdz2zf)

Additionally, you set an extra condition on the radius (u^2 + v^2 > 0.38^2) to ensure that the
bones do not intrude into the head’s space. This restriction keeps the elements separated
and prevents the final result from appearing visually cluttered or overwhelming.

3.2 Variables for Animation and Procedural Modification.

Up to this point, your pirate skull has been built entirely using fixed coordinates and constant
values. This approach has allowed you to interpret and construct the figure in a controlled
manner. However, if you want to introduce movement or visual variations, you will need to
incorporate variable values that let you manipulate key elements of the design.

To achieve this, you will define a set of properties in the Desmos interface. These variables
will act as control parameters, enabling you to modify the character’s expression in real time.
Below is a description of each one:

Chevron-Circle-Right 𝐴1 : Controls the eyebrow rotation angle. The range is from 0 to 45 degrees.
Chevron-Circle-Right 𝐴2 : Adjusts the rotation angle of the rear bones, with a range from 17 to 45 degrees.
Chevron-Circle-Right 𝑆: Sets the skull’s face size, ranging from 0.30 to 0.38.
Chevron-Circle-Right 𝑃: Moves the face (eyes, eyebrows, and nose) vertically, with a range from 0.0 to 0.1.
Chevron-Circle-Right 𝑇: Controls the character’s eye size, ranging from 0.04 to 0.1.

With these properties defined, you can place them at the top of the left panel in the Desmos
interface, as shown below:


•••

(3.2.a https://www.desmos.com/calculator/yjhn9vpe3j)

Implementing these properties in your equations is a straightforward process. You simply
need to remember the purpose of each variable and replace the constant values with them
in the corresponding expressions.

For example, if you replace the fixed value 45 with the variable 𝐴1 in the constant 𝑅1 , you will
be able to dynamically adjust the eyebrow rotation angle. The same applies to the constant
𝑅2 — by replacing its value 45 with 𝐴2 , you can control the rotation angle of the rear bones
in real time.

•••

(3.2.b https://www.desmos.com/calculator/v9krsdd1h2)


Let’s move on to the 𝑆 variable, which you will use to scale the head up or down. If you
applied this variable only to the head’s radius, other elements — such as the bandana, jaw,
and rear bones — would remain unchanged, causing visual misalignment. To prevent this,
you need to propagate the 𝑆 variable to other parts of the design. Here’s how to do it:

Chevron-Circle-Right Head: Replace the fixed value 0.35 with 𝑆 so the head’s radius becomes dynamic.
Chevron-Circle-Right Bandana: Apply 𝑆 to the first limit used in constructing the bandana, ensuring its shape
stays within the new scale.
Chevron-Circle-Right Jaw: Replace the value 0.35 in its corresponding limit with 𝑆, and adjust its dimensions
so it scales consistently with the head.
Chevron-Circle-Right Back bones: Replace 0.38 with 𝑆 in both limits, then add 0.03 to maintain their relative
position to the outer edge of the skull.

If all these steps are applied correctly, the resulting expressions should look like this:

•••

(3.2.c https://www.desmos.com/calculator/qcdrrd2jav)

To ensure that the jaw expands along with the head’s radius, you must also adjust its dimen-
sions based on the 𝑆 variable. Specifically, you will modify both the 𝑢 and 𝑣 coordinates
within the original equation so that its size and position adapt dynamically to the head’s
growth.


The adjusted expression is as follows:

•••

(3.2.d https://www.desmos.com/calculator/zay3r7ynov)

Here, the amount 𝑆 − 0.23 defines the new maximum width allowed for the jaw, while
𝑆 − 0.15 adjusts its relative height within the face. By applying these values, the jaw grows
proportionally with the head, preserving the original shape and avoiding visual misalignment.

The 𝑃 variable will let you modify the vertical position of the facial elements, which is useful
for creating expressive variations or simple animations. To keep the displacement consistent,
you must apply it not only to the eyes, nose, and eyebrows, but also to the limits that define
the face’s contour and the bandana. The necessary changes are as follows:

Chevron-Circle-Right Head: Add 𝑃 to the 𝑣 coordinate to adjust its surrounding visual limits.
Chevron-Circle-Right Eyes: Apply 𝑃 directly to the 𝑣 variable inside the equation.
Chevron-Circle-Right Nose: Subtract 𝑃 from 𝑣 (that is, use −𝑃 − 𝑣) in both the main condition and the
limits to invert the displacement direction, since the nose points downward.
Chevron-Circle-Right Eyebrows: Add 𝑃 to the 𝑦 coordinate of the rotated points 𝑝0 and 𝑝1 to keep them
aligned with the rest of the face.
Chevron-Circle-Right Bandana: Add 𝑃 to the 𝑣 variable inside the second limit used to trim it correctly
according to the eyebrow rotation.


If all these adjustments are applied correctly, your expressions should look as follows:

•••

(3.2.e https://www.desmos.com/calculator/hbet5kdjgp)

Finally, you will use the 𝑇 variable to increase or decrease the character’s eye size. Its
implementation is straightforward: simply integrate it into the radius of the equation that
defines the eyes, multiplying its value by a constant that preserves the original design’s
proportions and aesthetics.

In this case, the chosen base value is 0.5, as shown below:

•••

(3.2.f https://www.desmos.com/calculator/bvccatrkgb)

This approach allows you to dynamically control the eyes’ opening, which can be useful both
for adjusting the character’s expression and for experimenting with more cartoon-like styles.

In the next section, you will implement all these expressions directly in GDSL. To do so, you will
return to Godot, create a shader, assign it to a material, and set up the project structure
once again.


3.3 Implementation of the Procedural Character in GDSL.

In this section, you will implement the mathematical equations developed in the previous
sections to draw your pirate using GDSL. Before starting the coding process, you will organize
the project’s folder structure to maintain an orderly and easily scalable workflow. To do this:

Chevron-Circle-Right Inside the assets folder, create a new subfolder named chapter_03, where you will
store all the resources for this chapter.
Chevron-Circle-Right Inside chapter_03, add another folder named procedural_shape.
Chevron-Circle-Right Finally, inside procedural_shape, create two subfolders:
Chevron-Circle-Right materials.
Chevron-Circle-Right shaders.

For this exercise, you will use only a QuadMesh, which you will set up in the scene using a
Node3D along with a MeshInstance3D. This object will be enough to accurately visualize the
pirate’s shape, making it ideal for this case study.

You will also need a shader and its corresponding material to project the figure onto the
Quad. In the materials folder, right-click and select:

Chevron-Circle-Right Create New > Resource > ShaderMaterial.

For practicality, name it procedural_shape. Then, create the associated shader. In the
shaders folder, right-click and select:

Chevron-Circle-Right Create New > Resource > Shader.

Give it the same name (procedural_shape) to keep a clear and consistent relationship
between both resources.

If you have followed all the steps correctly, your project structure should look like this:


•••

(3.3.a procedural_shape has been included in the project)

Note

Before starting, make sure to assign the procedural_shape shader to its corresponding mate-
rial and then apply this material to the QuadMesh in your scene. This will allow you to visualize
changes in real time as you work through the exercise.

As a first step, you will perform a quick practical test to analyze how UV coordinates behave
in Godot, since their orientation may differ from other languages or environments.

If you go to the fragment stage of your shader and write the following code:

•••
void fragment()
{
vec2 uv = vec2(UV.x, UV.y);
ALBEDO = vec3(uv, 0.0);
}

You will see, directly on the QuadMesh, that the V coordinate (equivalent to the 𝑦-axis in the
Cartesian plane) is inverted, with the origin (0.0) located in the upper-left corner, as shown
below:


•••

(3.3.b The V coordinate points downward)

This behavior, common in languages such as HLSL, poses a problem when working with
procedural shapes. If you use the same equations along with their respective variables and
constants, the character will appear vertically flipped for obvious reasons.

Therefore, the first step is to adjust the V coordinate so that it follows the same orientation as
the 𝑦-axis in the Cartesian plane. This will allow you to maintain visual consistency between
what you design mathematically and what you render on screen.

•••

(3.3.c UV coordinate comparison across languages)


Note

If you have previously worked with Unity, you may have noticed that UV coordinates start in the
lower-left corner, not in the upper-left corner as shown in Figure 3.3.c (based on HLSL). This is
because Unity supports multiple rendering APIs such as Direct3D, OpenGL, Metal, and Vulkan.
To maintain visual consistency across platforms, Unity adopts a unified coordinate system and
automatically flips the V coordinate, making its behavior closer to GLSL rather than HLSL.

To replicate this behavior in Godot, you simply need to invert the vertical coordinate with the
following operation:

•••
void fragment()
{
vec2 uv = vec2(UV.x, 1.0 - UV.y);
ALBEDO = vec3(uv, 0.0);
}

With this correction applied, you can now start drawing the different parts of the pirate skull
directly in your shader. The first step is to declare the properties that will let you animate the
character. While you could reuse the same names defined in Desmos, to improve readability
and code clarity you will adopt a more structured naming scheme, as shown below:


•••
1 shader_type spatial;
2 render_mode unshaded;
3

4 // source : https://www.desmos.com/calculator/bvccatrkgb
5

6 uniform float _EyebrowRotation : hint_range(0.0, 45.0, 0.1); // A1


7 uniform float _BackBoneRotation : hint_range(17.0, 45.0, 0.1); // A2
8 uniform float _HeadSize : hint_range(0.30, 0.38, 0.01); // S
9 uniform float _FacePosition : hint_range(0.0, 0.1, 0.01); // P
10 uniform float _EyeSize : hint_range(0.04, 0.1, 0.01); // T
11

12 uniform vec3 _HeadColor : source_color;


13 uniform vec3 _EyeColor : source_color;
14 uniform vec3 _BackBoneColor : source_color;
15 uniform vec3 _BandanaColor : source_color;
16 uniform vec3 _EyebrowColor : source_color;

What is happening in our code? Let’s break it down:

Chevron-Circle-Right Line 2: You set the shader to unshaded, since the QuadMesh doesn’t need to receive
scene lighting. This simplifies the display of flat shapes and colors.
Chevron-Circle-Right Lines 6 - 10: You declared the same properties used in Desmos ( 𝐴1 , 𝐴2 , 𝑆, 𝑃, 𝑇 ).
However, here they are renamed using a more descriptive style consistent with common
shader conventions, which makes the code easier to read and maintain.
Chevron-Circle-Right Lines 12 - 16: You added several vec3 properties to define custom colors for each part
of the character — such as the head, eyes, bandana, and eyebrows. These variables
act as input parameters that you can modify from the Inspector.

Now that all properties are declared, you’re ready to start drawing the pirate’s head. You’ll
begin with the first of your equations: a centered circle.

Before that, you must make sure to center the UV coordinates. As you’ve seen, Godot’s UVs
go from 0.0 to 1.0, with the origin at the lower-left corner (after manually inverting V). To draw
the figure in the center of the QuadMesh, subtract 0.5 from both components:


•••
22 void fragment()
23 {
24 float u = UV.x - 0.5;
25 float v = (1.0 - UV.y) - 0.5;
26 vec2 uv = vec2(u, v);
27

28 float head = float(uv.x * uv.x + uv.y * uv.y < _HeadSize * _HeadSize);


29

30 ALBEDO = vec3(head);
31 }

As you can see:

Chevron-Circle-Right Lines 24 - 26: You declared a new 2D vector called uv, which contains the centered
coordinates.
Chevron-Circle-Right Line 28: You declared a scalar named head, which represents the result of a centered-
circle equation. This operation checks whether the current point (uv) lies within the
radius defined by _HeadSize. Since the comparison returns a boolean value (true or
false), you explicitly cast it to float so it can be used as a visual output.
Chevron-Circle-Right Line 30: Finally, you assign head to the base color (ALBEDO), allowing you to visualize
the shape on the Quad.

This procedure generates the visual representation of the pirate’s skull, as shown below:

•••

(3.3.d The pirate’s head has been drawn)


It’s worth noting that the initialization of head can be optimized by using the dot product dot()
instead of manually summing the squared components, since both operations produce the
same result:

•••
float head = float(dot(uv, uv) < pow(_HeadSize, 2.0));

With this optimization applied, you can continue with the jaw declaration, which we implement
using an equation composed of two min() functions raised to the fourth power. This shape
creates a rectangular figure with softened edges. The implementation looks as follows:

•••
void fragment()
{

float head = float(dot(uv, uv) < pow(_HeadSize, 2.0));


float jaw = float(pow(min(_HeadSize - 0.23 - abs(u), 0.0), 4.0) +
↪ pow(min(0.12 - abs(v + (_HeadSize - 0.15)), 0.0), 4.0) < pow(0.05,
↪ 4.0));

head = clamp(head + jaw, 0.0, 1.0);

ALBEDO = vec3(head);
}

Finally, you combine both shapes by adding jaw to head and using clamp() to limit the
resulting value between 0.0 and 1.0, thus preventing potential visual artifacts caused by
saturation. This operation allows you to display both the upper skull and the lower jaw as a
single composite figure, preserving the aesthetic defined in Desmos.


•••

(3.3.e The pirate’s jaw has been drawn)

You can now continue with the implementation of the character’s eyes. To achieve the
desired effect, you will need two circles: one for the outer edge of the eye and another for its
interior. To simplify the code and keep it modular, you can encapsulate this logic inside a
custom function, as shown below:

•••
float eyes_shape (vec2 uv, float p, float r, float s)
{
float u_coord = abs(uv.x) - 0.15;
float v_coord = uv.y + p;
return float(u_coord * u_coord + v_coord * v_coord > (r * r) * s);
}

This function represents a shifted circle equation, based on the form described in Figure 3.1.i,
with the difference that here you are using the 𝑃 property (defined earlier in Section 3.2)
to control the eyes’ vertical position relative to the head. Two additional parameters are
included: r, which represents the eye’s base radius, and s, which acts as a scale factor to
control its size.


•••
void fragment()
{

head = clamp(head + jaw, 0.0, 1.0);


head *= eyes_shape (uv, _FacePosition, 0.1, 1.0);
head += 1.0 - eyes_shape (uv, _FacePosition, _EyeSize, 0.5);

ALBEDO = vec3(head);
}

First, you multiply the output of eyes_shape() by the head, using a base radius of 0.1 and a
scale factor of 1.0. This creates the eye socket, ensuring it is clipped within the head area.

Then, using the function again — but with the _EyeSize variable and a scale factor of 0.5 —
you add the eye’s interior. You invert the result so that the final shape appears as a circular
white mass inside the previously defined socket.

The values used here match the parameters previously defined in Desmos, so the visual
result should be practically identical to the reference.

•••

(3.3.f Different eyes layers)

Additionally, since you’ve already integrated the _FacePosition and _EyeSize properties,
you can return to the Inspector in Godot to adjust the eyes’ position and size dynamically,
giving you greater visual flexibility when experimenting with different expressions.


Before continuing with other shapes that make up the character, you need to modify your
code to incorporate the color properties you declared earlier. Until now, the rendered shapes
have used only black or white values (1.0 or 0.0), which is useful for masks but insufficient for
the character’s final render.

If you want each visual component to have its own custom RGB color, do the following:

•••
29 void fragment()
30 {
31 …
34 .
35 float head = float(dot(uv, uv) < pow(_HeadSize, 2.0));
36 float jaw = float(pow(min(_HeadSize - 0.23 - abs(u), 0.0), 4.0)
37 + pow(min(0.12 - abs(v + (_HeadSize - 0.15)), 0.0), 4.0) < pow(0.05,
↪ 4.0));
38 float eyes = 1.0 - eyes_shape(uv, _FacePosition, _EyeSize, 0.5);
39 float nose = 1.0 - float(-_FacePosition - v > 2.0 * abs(u) + 0.05)
40 * float(-_FacePosition - v < 0.15);
41

42 head = clamp(head + jaw, 0.0, 1.0);


43 head *= eyes_shape(uv, _FacePosition, 0.1, 1.0);
44 head *= nose;
45

46 vec3 render_rgb = vec3(0.0);


47 float alpha = 0.0;
48

49 vec3 head_color = _HeadColor * head;


50 vec3 eyes_color = _EyeColor * eyes;
51

52 render_rgb += head_color;
53 render_rgb += eyes_color;
54

55 ALBEDO = render_rgb;
56 }

What’s happening in your code? Let’s break down each part:

• Line 38: You declare a new scalar named eyes, which stores the eye value generated earlier (previously added directly to head). You now separate it so you can mask it with its own color.
• Lines 39 - 40: You define the nose variable, which represents the triangular nose shape based on the equation described in Figure 3.1.j. It includes the properties needed to control its position dynamically.
• Line 44: You multiply head by nose to visually subtract the nose from the head, creating a hole in its place.
• Lines 46 - 47: You declare render_rgb, an RGB vector that will accumulate the fragment’s final color, and alpha, a scalar to control the shader’s transparency.
• Line 49: You create head_color, an RGB vector containing the defined color for the head, multiplied by the head mask.
• Line 50: You create eyes_color, which contains the eye color, and mask it with the eyes variable.
• Lines 52 - 53: You add head_color and eyes_color to the render_rgb accumulator.
• Line 55: You assign the final result to the shader output via ALBEDO.

With this implementation, if you assign custom colors from the Inspector to the _HeadColor
and _EyeColor properties, you will obtain the following visual result:

•••

(3.3.g Skull color: 00d2a6; eyes color: ffa13e)

Next, you will implement the pirate’s bandana. As developed earlier in Desmos, this shape is
composed of a curve that is trimmed in three specific regions:

• around the head,
• below the eyebrows,
• and along the upper edges of the eyes.

To simplify the process in this first pass, you will focus on the head’s general mask, since it
already contains the necessary constraints to avoid overlapping with the eyes and nose.
The code is as follows:

•••
29 void fragment()
30 {
31 …
40 .
41     float bandana = float(sqrt(UV.x) < 1.0 - UV.y);
42
43     head = clamp(head + jaw, 0.0, 1.0);
44     head *= eyes_shape(uv, _FacePosition, 0.1, 1.0);
45     head *= nose;
46     bandana *= head;
47
48     vec3 render_rgb = vec3(0.0);
49     float alpha = 0.0;
50
51     vec3 head_color = _HeadColor * head;
52     vec3 eyes_color = _EyeColor * eyes;
53     vec3 bandana_color = _BandanaColor * bandana;
54
55     render_rgb += head_color;
56     render_rgb += eyes_color;
57     render_rgb = mix(render_rgb, bandana_color, bandana);
58
59     ALBEDO = render_rgb;
60 }

Let’s break down what’s happening:

• Line 41: You declare the bandana variable, which defines its shape using the comparison sqrt(UV.x) < 1.0 - UV.y. This curve crosses the face, matching the design shown in Desmos.
• Line 46: You mask bandana with head, ensuring it only renders within the skull’s boundaries. Since the head already excludes the eyes and nose, no additional masking is needed.
• Line 53: You define bandana_color, an RGB vector that contains the bandana’s color multiplied by its mask.
• Line 57: You perform a linear interpolation between the accumulated render_rgb and bandana_color, using bandana as the weight. This ensures a smooth visual blend when shapes partially overlap.

If you go back to the Inspector and select a color for the bandana, you will obtain the following
result:

•••

(3.3.h Bandana color: ff7587)

Only the eyebrows and the back bones remain to be implemented. You already know these
elements need to be rotated, so you will define a function that lets you apply rotations to
specific points in space:

•••
vec2 rotation(float x, float y, float a)
{
float ru = cos(a) * x + sin(a) * y;
float rv = -sin(a) * x + cos(a) * y;
return vec2(ru, rv);
}

This function, first introduced in Section 3.1, lets you rotate a two-dimensional point by an
angle a expressed in radians. Although you could optimize the x and y parameters by using
a single vec2, they are kept separate to maintain a more direct correspondence with the
original function shown in Figure 3.1.n, making its mathematical analysis easier.
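
If you do prefer the compact form, a possible variant (shown only as a sketch, not as the listing
used in this book) receives a single vec2 and expresses the same rotation as a 2x2 matrix. GDSL
matrices are column-major, so the columns are arranged to reproduce exactly the ru and rv
expressions above:

•••
// Hypothetical vec2 / mat2 version of rotation(); equivalent to the function above.
vec2 rotation_mat(vec2 p, float a)
{
    // First column holds (cos a, -sin a); second column holds (sin a, cos a).
    mat2 r = mat2(vec2(cos(a), -sin(a)), vec2(sin(a), cos(a)));
    // r * p returns (cos a * p.x + sin a * p.y, -sin a * p.x + cos a * p.y).
    return r * p;
}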

Once implemented, you can use this function inside the fragment to rotate key points of
the character. In this case, you will start with the right eyebrow. However, since you need
to represent both eyebrows — left and right — you will define a new method that lets you
determine their shape in a reusable way. Proceed as follows:

•••
float eyebrow_shape(vec2 p)
{
float r = 0.03;
float u = min(0.05 - abs(p.x), 0.0);
float v = p.y;
vec2 uv = vec2(u, v);
return float(dot(uv, uv) < r * r);
}

The eyebrow_shape() function is a direct translation into GDSL of the form shown in Figure
3.1.o. In this case, you use the expression dot(uv, uv) to compute the squared magnitude
of the uv vector, which is an optimized way to check whether a point lies within a radius r.

Note

There are several ways to express this operation: (1) x * x + y * y, (2) pow(x, 2.0) + pow(y, 2.0),
and (3) dot(uv, uv). Of the three, dot() is the most efficient and readable, which is why it is
used here to define the eyebrow’s curved shape.

At this point, you will start with the right eyebrow, which will be evaluated using the
eyebrow_shape() method, as shown below:

•••
45 void fragment()
46 {
47 …
58 .
59     float r1 = _EyebrowRotation * (PI / 180.0);
60     vec2 p0 = rotation(u - 0.1, v - 0.15 + _FacePosition, r1);
61     float eyebrow_r = eyebrow_shape(p0);
62
63     head = clamp(head + jaw, 0.0, 1.0);
64     head *= eyes_shape(uv, _FacePosition, 0.1, 1.0);
65     head *= nose;
66     bandana *= head;
67
68     vec3 render_rgb = vec3(0.0);
69     float alpha = 0.0;
70
71     vec3 head_color = _HeadColor * head;
72     vec3 eyes_color = _EyeColor * eyes;
73     vec3 bandana_color = _BandanaColor * bandana;
74     vec3 eyebrows_color = _EyebrowColor * eyebrow_r;
75
76     render_rgb += head_color;
77     render_rgb += eyes_color;
78     render_rgb = mix(render_rgb, bandana_color, bandana);
79     render_rgb = mix(render_rgb, eyebrows_color, eyebrow_r);
80
81     ALBEDO = render_rgb;
82 }

Let’s break down what’s happening:

• Line 59: The value of _EyebrowRotation is converted from degrees to radians, and the result is stored in the r1 variable.
• Line 60: A prior translation is applied to place the eyebrow’s local origin with an offset in u and v, plus the dynamic vertical adjustment via _FacePosition. Then the rotation() function is applied using the r1 angle. The result is stored in the p0 vector.
• Line 61: A new scalar named eyebrow_r is declared and initialized, corresponding to the pirate’s right eyebrow.
• Line 74: A new RGB vector named eyebrows_color is declared and initialized, multiplying the eyebrow color by the eyebrow_r mask so that color is applied only to the pixels where the eyebrow shape is present.
• Line 79: A linear interpolation is performed between the current color accumulated in render_rgb and the eyebrow color eyebrows_color.

Once the changes are saved, by selecting a color for the _EyebrowColor property in the
Inspector, you can clearly see the right eyebrow rendered on the QuadMesh.

Next, you will add the left eyebrow using a horizontal reflection, following the same principle
previously applied in Desmos and revisited in Figure 3.1.p. To do this, return to the code and
perform a symmetric operation to the one used for the right eyebrow.

•••
void fragment()
{

    float r1 = _EyebrowRotation * (PI / 180.0);

    vec2 p0 = rotation(u - 0.1, v - 0.15 + _FacePosition, r1);
    vec2 p1 = rotation(-u - 0.1, v - 0.15 + _FacePosition, r1);
    float eyebrow_r = eyebrow_shape(p0);
    float eyebrow_l = eyebrow_shape(p1);
    float eyebrows = eyebrow_l + eyebrow_r;

    vec3 eyebrows_color = _EyebrowColor * eyebrows;

    render_rgb = mix(render_rgb, eyebrows_color, eyebrows);

    ALBEDO = render_rgb;
}

As you can see, a new point named p1 has been declared, which corresponds to the horizontal
reflection of p0 to generate the left eyebrow. The shape of each eyebrow is then evaluated
once again using the eyebrow_shape() function, and the results are stored in eyebrow_r
(right) and eyebrow_l (left). Both values are added and stored in the eyebrows variable,
which represents the combined mask of both eyebrows.

Finally, this mask is used both to generate the corresponding color (eyebrows_color) and to
perform a linear interpolation over the accumulated color (render_rgb), visually integrating
the eyebrows with the rest of the character.

These operations produce the following visual result:

•••

(3.3.i Dark turquoise color: 007a71)

Finally, you will add the back bones. To do this, you will start from the equation shown in
Figure 3.1.q, which defined this shape in Desmos. Its direct translation into GDSL is as follows:

•••
float back_bone_shape(vec2 p)
{
float r = 0.06;
float u = min(0.42 - abs(p.x), 0.04);
float v = abs(p.y) - 0.04;
vec2 uv = vec2(u, v);
return float(dot(uv, uv) < r * r);
}

Next, you will need to rotate both back bones. To do this, apply the rotation() function to
two new points, p2 and p3, one for each side of the skull. This procedure is integrated into
the fragment stage as follows:

•••
void fragment()
{

    float to_radians = PI / 180.0;

    float r1 = _EyebrowRotation * to_radians;
    float r2 = _BackBoneRotation * to_radians;

    vec2 p0 = rotation(u - 0.1, v - 0.15 + _FacePosition, r1);
    vec2 p1 = rotation(-u - 0.1, v - 0.15 + _FacePosition, r1);
    vec2 p2 = rotation(u, v, r2);
    vec2 p3 = rotation(-u, v, r2);

}

If you look closely at the previous code snippet, you’ll notice a new variable named
to_radians, which converts degree values to radians. This conversion is essential for cor-
rectly using trigonometric functions in GDSL, since they expect angles in radians. Thanks to
this constant, you can enter values in degrees from the Inspector, keeping a more intuitive
interface for adjusting rotations.
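
As a side note, GDSL also inherits the radians() built-in from GLSL, so the same conversion
could be written without the manual constant. The lines below are only an equivalent sketch,
not the listing used in this chapter:

•••
// Equivalent conversion using the radians() built-in.
float r1 = radians(_EyebrowRotation);
float r2 = radians(_BackBoneRotation);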

The values of _EyebrowRotation and _BackBoneRotation, converted to radians, are stored in
r1 and r2, respectively. Then, two new two-dimensional vectors are declared: p2 and p3. The
first represents the base point for one of the back bones, while the second is its reflection
across the vertical axis, creating the symmetry needed on both sides of the character.

Once these points are defined, you can use the back_bone_shape() function to draw the
bones in the shader. However, before doing so, remember that the back bones partially
overlap with the head and that their area is slightly larger, as observed in Figure 3.1.r.

For this reason, it’s a good idea to reuse the head mask as a way to visually limit the bones.
To do this, you will encapsulate the operation that computes the head into a new function,
as shown below:

•••
float head_shape(vec2 uv, float s)
{
return float(dot(uv, uv) < pow(s, 2.0));
}

The head_shape() method returns the same circular mask used to represent the head.
Thanks to this reusable function, you can now directly replace the previous operation with a
clearer, more concise call:

•••
float head = head_shape(uv, _HeadSize);

This line improves code readability and lets you reuse the same logic in other elements that
depend on the head’s shape, such as the back bones.

To generate these bones, you start by evaluating their shape using the rotated points p2
and p3 defined earlier. These points represent the symmetric ends of the back, and they are
evaluated with the back_bone_shape() function as shown below:

•••
void fragment()
{

    float eyebrows = eyebrow_l + eyebrow_r;

    float back_bone_l = back_bone_shape(p2);
    float back_bone_r = back_bone_shape(p3);
    float back_bones = clamp(back_bone_l + back_bone_r, 0.0, 1.0);

    bandana *= head;
    back_bones *= 1.0 - head_shape(uv, _HeadSize + 0.03);

    vec3 eyebrows_color = _EyebrowColor * eyebrows;
    vec3 back_bone_color = _BackBoneColor * back_bones;

    render_rgb += back_bone_color;

    ALBEDO = render_rgb;
}

The sum of both evaluations generates the back_bones mask, which represents the two
bones. You apply a clamp() function to ensure the final value stays between 0.0 and 1.0,
preventing interpolation or saturation issues.

Next, you use the head shape as an inverse mask over the bones so they don’t visually
interfere with the character’s face. To do this, you slightly increase the radius and subtract
the mask, as shown in this line of code:

•••
back_bones *= 1.0 - head_shape(uv, _HeadSize + 0.03);

Finally, you compute the corresponding bone color and integrate it into the overall color
composition. The color is applied only where the back_bones mask has value, ensuring the
bones blend harmoniously into the visual composition. When you view the final result on
screen, you should see the complete pirate with all elements correctly masked, rotated, and
colored.

•••

(3.3.j Back bones color: d2da00)

Up to this point, you could consider the pirate complete in terms of shape and color. However,
one essential aspect is still missing to integrate it correctly into any scene: transparency.
Adding transparency in GDSL is a simple, straightforward process.

To achieve this, you will reuse the alpha variable, previously declared as the opacity accu-
mulator. Then, you will add the masks corresponding to the character’s visible elements: the
head (head), the eyes (eyes), and the back bones (back_bones). This composite value will
be used as the alpha channel in your fragment.

The resulting code looks as follows:

•••
void fragment()
{

    render_rgb += back_bone_color;

    alpha += head;
    alpha += eyes;
    alpha += back_bones;

    ALBEDO = render_rgb;
    ALPHA = alpha;
}

Thanks to this cumulative sum, transparency is applied only in the areas where the masks
are present. This allows the rest of the quad to remain completely transparent, removing
any unwanted background around the character.

When you view the result, you will notice that the pirate is now fully integrated with an alpha
channel that outlines only its silhouette. This is particularly useful if you want to project it
onto different scenes or combine it with other visual effects.
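
Keep in mind that writing to ALPHA is what switches the material to Godot’s transparent
pipeline, so no additional render mode is required for this basic case. If you also want to guard
against overlapping masks pushing the accumulated value above 1.0, you can clamp it before
assigning it. This is an optional safeguard rather than part of the original listing:

•••
// Optional: keep the accumulated opacity within the valid 0.0 - 1.0 range.
ALPHA = clamp(alpha, 0.0, 1.0);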

•••

(3.3.k Different color configuration for the procedural pirate skull)


Special thanks.
Austin Hackett | Hang Yeh | Sergiox | Stephen Rice | Herman Garling Coll | Matthew
| Ninanico | Neil Palmere | Shiena Jp | Bohdan “teo” Popov | Numa Schilling | Tm
| Grzegorz Drabikowski | Sean Laplante | Max | Carlos Eduardo Boaro | Sergey
Yaroshuk | Vojtěch Lacina | João Marinheiro | Jacek Adamus | Francis Jasmin |
Antoine Luciani | Sander Vanhove | Tahldon | Raveen Rajadorai | Richard Cobbett
| Adrian Jenkins | Vlad Scp | Hilton Perantunes | Pokemonbrycedixon | John Mandish | Alex |
Daniel Stoica | Ricardo Pitta | Patientlion | Claudio Cth | Swagman | Christian Koch | Francisco
Pereira Alvarado | Kyleharrison | Xxradex | Algio | Daniel Castellanos | Carlos Montes | Thomas
Detoy | Combat Lobster | Adrian Rivero | Ewald Schulte | Punk Flag | Miri N | Gripfastgameworks
| Laurenz | Gabriel Malinowski | Cameron Fernandez | Jes | Adamnakonieczny | James |
Ethan Lucasawesome | Shahid Akhter | Benjamin Russell | John Pitchers | Jonne | Yun Wang
| Mazhiwei | Timurariman | M Gewiss | Chunck Trafagander | Dmytro Hryppa | Patrick Joos
| Joseph Torbett | Jason Peterson | Mako | Norbert Haacks | Mika | Sean Mcintyre | Collin
Myhre | Dottorreed | Morpheus Tsai | Aiastesla | Fabian Ruch | Jan Nothacker | Rémy Tauziac |
Martijn Van Halderen | Yon Montoto | Walpurgis Time | Keating | Andrey Valkov | Thsage | Eric
Wong | Nils Nordmark | Nikita Blizniuk | Joseph Deluca | Francesco Michelini | Christopher Hall
| Jaycoleman | Madlion | Graham Reid | D-hill | Marcanthonycacho | Thomas

Jahtani | Tskarthik | Mutrz | Vasil Genov | Manraaj Nijjar | Kovidomi | Robles | Jesse
Virtanen | Domingo Mesa-maliniak | Frederic Batardy | Matoo | Jiří “venty” Michel
| Goldennick | Ginton | 박용민 | Hi | María Luisa Carrión | Andrew Meyer | M | Trent L |
Mark H | Crapule | Dorian Nicou | Sam | Suchaaver Chahal | Victor | Andrew Herch
| Boti Beto | Raymond Gibbons | Phil | Thomas Hunt | Constantin Balan | Brandon
Grenier | Abyssallure | Katia Mccarthy | Joakim Karlsson | Dmitrii | Markus Michel | Robert Allen
| Ross | Mr Thomas Graveline | Stephen James | Tiago Ferreira | Anderson Borba | Andreas
Gillberg | Morgan Wesemann | Janusz Tarasewicz | Ben Thedragon | ガンズターン公式 | Saul |
Flyn | Hellno | Karthik Nallan | Rosciuc Bogdan | Big Quail | Matt | Justo Delgado Baudí | Roose
Alexandre | Mangirdas Balciunas | Jettelly | Asylbekz | Roman Koziolek | Alvaro Larre Borges |
Creta Park | V | Nikola Stanojević | This Is Bennyk | Santiago | Alvaro G Lorenzo | Zachnorthcott
| Orion Ironheart | Brandon Farrell | Nicolas | Douglas Ávila | Mª Dolores Lebrecht | Bernard
Cloutier | Isaac Iverson | Matheus V | Christophe Brenon | Dave | Vtalyh | Jacopo Galati | K G H
| Sam Van Berlo | Joshua | Šarūnas Ramonas | Tamego | Emerson Santos | Ashely | Zoe Firi |
Liyizhanguk | Aleksander G Solheim | Ian | Michael Belknap | Mattias | Fu Ruru | Karim Matrah |
Zhou Kaiwen | Quentin Delvallet | Mmdrewind

Emmanuel | Someone | Maxie | Patrick Exner | Ivan Ho | Jesse Fletcher | Chris
Morgan | Carlos Eduardo Pérez Villanueva | William Brendaw | Swampnrd | Shader
Snake | S�n Nguy�n Minh | Gerwyn Jones | Seth Foznot | Harry Cassell | Deear
Art | Cillian Clifford | Sabre | M Fatati | Ardscott | Guilherme Cattani | Albert Miro
Montalban | Georges | Kev Zettler | Carsten Bloecher | Clément Vayer | Jose Ignacio Palacios
Ortega | Mcspidey | François De La Taste | Paul | Ross Solomon | Gabo Salcedo | Caigan |
Brad Svercl | Jing Yi Chong | Mykola Morozov | Kalsarin | Zewy | Phoen Leo | Sebastiankmilo
| Hipton | Bawlshy | Tim Arlow | Ffdd | Maksim Loboda | Colin | Lisandro Lorea | Björn Wilke |
Robert Mahon | Dave Lecompte | Pawel Antoniuk | Carlos Aguirre Vozmediano | Eliott Chillag |
Kyle Hessel | Jose Angel Canabal Delgado | Benjamin Oesterle | Ngo Viet Luan | Meleg Zoltán
| Michael Jones | Kevin Krause | Daniel Cavazos | Travis Womack | Ryan Hinds | Dewey Mowris
| No Yes | Nigel Che | Bass | Juanye | Britt Selvitelle | Tor | Noah | Stuart Carnie | Kamwade |
Smedstadc | Anton | Js | Mads Sønderstrup | Steffenschmidt | Tor-arne Nordmo | Алексей
Акулов | Bioplant | Lucas Felix | Natevc | Nathan | Kangweon Jeong | Mei Yamasaki | Meownoi
A | Nathanrvaughn | Monica | Lighthalzenxii | Nike | Mapinher | Xavier Shaver | Jack Lueyo |
James Clancey | Thomas Guyamier | Alfred Reinold Baudisch | Max | Default | Ben Mckenna

Ondřej Kadlec | Tate Sliwa | Dt | Marino | Calvin M Evans | Alen Lapidis | Thoughton
| Mariano | Daquan Johnson | Eli Greenwald | Adam Gulacsi | Nea Pocem | Liming
Liu | Michael Birtwistle | Tobias Brandner | Richard Pearson | Dimitri Bauer | Paulo
Poiati | Edwin | Chauncey Hoover | João Luís Reis | Amy Haber | Corentin Lallier |
Alvinanda Purnomo | Kerry K | Ana Greengrass | Juno Nupta | Israel | Aaron B | Kris
| Gabe Volpe | Keita Fukuchi | Clémentine Ducournau | Gustavo León | Haydenvanearden |
Jdon Leininger | Bohdan | Develop | Caitlin Cooke | Alexandre Schenk-lessard | Grãodopão |
Terencep Dixon | Anatol Bogun | Zammecas | Benji Lising | Kevin Abellan | Ryan T | William |
John Daffue | Simone Sorrentino | Craig Spivack | Pavol Gocik | Clpereira | Alejandro Vera |
Yb Park | Pyrat | Acorzomate | Oniric View Sl | Tulasi Rao | Patricioland | Xanroticon | Nicolas
Delbaer | Jordan | Thom | Findemor | Amasten Ameziani | Patrick | Kacey Walsh | Nathakorn
Throngtrairat | Christian Snodgrass | Emmanuel Bouis | Pokkefe | Mike Klingbiel | Drew Herbert
| Ross Rothenstine | Rhenn | Talal Almardoud | Vinicius Hoyer | Matti Pohjanvirta | Tuomas
Ilomäki | Elias Manzaneda Santamaria | Daniel Canham | Liam Cornbill | Daniël Michael Kracht
| Eliud De Leon | Sean Welgemoed | Pangake | Jules Fijolek | Krystof Klestil | Luc Magitem | Thijs
Tak | Adam Spivey | David Kol | Glenn Barker | Adrián | Gianluca Pumoni | Mal | João Récio |
Jacob Lerche | Daniel R Martinez

John Pimentel | Kyle Young | Lysandre Marti | Matthew Makary | Craig Kerwin |
Sam Winfield | Alec White | Corentin Lecroq | Richard | Alex | Taylor Segura Vindas
| Erik | Ashley Moon | Simke Nys | Spencer Hoban | Nichas Brennan | Efrian Vega
Jr | Sam | Gato | Thomas D | April Clark | Jakub | Colin | Lockyer Road | William
Hoobler | Kenneth Holston | Brian | Carter Layton | Martin Voth | Brett Buckner | Joba López |
Llynx | Panupat Chongstitwattana | Bryan English | William N Kentris | Penny!! | Minoqi | Anton
Petersson | Aaron J Schneider | Nate Moore | Colin Miller | Edgar | Sam Blackman | Frank
Gusche | Gillian Monte | Eric Alvarez | Viktor Sladek | Ian | Paul Cloete | Ben | Ppsi | William
Beatty | Photonphighter | Arman Frasier | Farhad Yusufi | Lee | Andrew Lee | Codebycandle |
Łukasz K | Alfie | Jeff Zimmer | Game Log | Aaron Imrie | James Karavakis | Dylan Glow | Miles
Harris | Kris | Reavenheart | Benjamin Schmitt | Frodo | Tom Kertels | Connor R | Nathan | Gary
Johnson | Kein Hilario | Blake Rain | Bryant Flores | Joseph Sugabo | Eyal Assaf | Lewis Simpson
| Christopher Holt | Bluen | Jayden Jacksteit | Alan Jimenez | Noah Blake | Icoza | Elizabeth |
Alexandra Bradford | Agustin Adducci | Tim | Xuxing | Marcial Santallory | Austin Grimm | Rob
Brown | Odin Ugelstad | Henry Schwenk | James G Pearson | Marco | Danthecardboardman |
Aaron Wu

Gerben Van Der Heijden | Nil Vilas | King Artorias Arthur | Marcos | Andrew Moffat |
Jun Yeo | Matthew Swee Chuan Tan | Bruce Leidl | Mailjjb Jb | Ryan Shetley | J Chris
Druce | Mārtiņš Prelgauskis | Umut Ulutas | Niki | Hyou Sasaki | William Sattanuparp
| Darrin Henein | Sean Pace | David Betancourt | Gabriel Martinez | Miroslav Edler
| Angelina | Zane | James Mcvitty | Eric Roberts | James Jones | Robert Georges |
Terry Wu | Eldaville | Coryo Moore | Joshua Tucker | William Kruspe | Dizzy Moth | Peter Chantler
| Nural | Gustavo Lopes Domaradzki | Quock | Alec Gall | Steven Lenaerts | Mateus S Pereira
| Alex | Andrew Brizendine | Daniel Holmen | Garett Bare | Kevin Crowell | Spaghetmenot |
Michael Simon | Yuval Dekel | Alan K | Aidan Davey | King Artorias Arthur | Dominic Perno |
Anthony Beninati | Sammi Tu | James Steele | Pepijn Stuurwold Van Walsum | Holger Schmidt
| Oliver Gutiérrez | Nova Obrien | Fredrik Hamstad | Jonathan | Thomas M Kidder | Kai | Gk
| Ben Revai | Daniel Leiva | Miles Welsh | Cannon Pharaoh Mcclendon | Emanuel Lamela |
Jānis Alunāns | Michael Radke | Ryan Woods | Torgabor | Reid Hannaford | Mike Barbee |
Aria Ramshaw | Benjamin Paschke | Mathias | Charles Engel | Benjamin Flanagin | Jettelly
Publishing | Antonio Colon | Sac Alanpereira | Patrick Grant | Marquis Jackson | Miguel Angel
Mendez Cruz | Scottie Doria | Ordinary Fox | Daniel Ridao | Jorn Van Denbussche | Shelly Barnes
| Celia Tran | Rem | George | Aleksander | Dimitry Maizik | Dave | Ryanimates | Annabella Rosen
| Colton Wolkins

Daniel Rodriguez Aires | Matteo Kramer-tamburrino | Albin Lundahl | Luis Puentes
| Andrew Cotter | Dylan Goncalves Martins | Arroyojavier Sanandres | Xury Greer |
Jjolton | Adam | Daniel Carswell | Deear Art | Shawn Dhawan | Daniel Gerep | Steve
Versace | David Dam | Bdilmore | Dima | Josuprasi | Yaramirez | Michael Frederic
| Ruslan Horiuchkin | Nicholas Vosburgh | Chen | Mateusz | Flapp | Komatsuki | Auro | The
Zefan | Michail Maridakis | Sebastian Rangger | Henry Liu | Brandonrconyers | Andreas Becker
| André B Luz | Logan Croley | Daniel Mutis | Ulric Boel | Vitalik | Tom Brewer | Darmaz | Keagen
Bouska | Chance Mcdonald | Bidda | Pedro Perez | Antonio Contreras Arzate | Dan Crowe |
Jeong Hoon Shin | Benjamin F Hajas | Javis Jones | Kateryna | Seongho-son | Alessandro Pilati
| Brian Huqueriza | Hugo Toti | Ronald Casili | Sdc | Joe | Jordan Faas-bush | Eli Makaiwi | Henry
Audubon | Zoey Lome | Jaakko Ojalehto | Matt Odette | James Asumendi | Trevor Harron |
Alexander Abshagen | Maxim Oblogin | Tresceneti | Asher | Aaron | David Tag | Jake Elliott |
Tim | Tobias Cunningham | Ryan | Robert Fraser | Kristian Gibson | Konstantin Kudinov | Lizzy
Moyes | Charlotte Jones | Sethcrawford | Carlosgolfer | Ryan Renna | Roman | Jean-francois
Segretain | Joao Filho | Flowulf | Ben Boyter | Pearshaped | Jacob Yero | Felipe Fausto | Joaquin
Muñiz | Mel Irizarry | George Thompson | Hakan | Gerard Parareda | Roger Bujan | Eric | Jully

Jubimage | Luis Alfredo Figueroa Bracamontes | Chris | Alessandro Talloru | Adam
Gibson | Revo Wu | Aziz Abidi | Lennysioll | Maxim Jourenko | Tobias Holewa | Rxwp |
Darren Larose | Sebastian Karlsen | Pendant Fills | Damian Stähr | Niklas | Xuwaters
| Christian Ahlers | James Coleman | Ryan Moos | Tomáš Adámek | Adriangtuazon
| Cagri Benli | Darren | Guy | Mateusz Wolos | Kristofer | Jonathanrivasmencia
| Joelfreemand | Eric Persson | Saul | Lxyhaqs | Ericpdss | Tấn Phát Huỳnh | Guma | János
Harsányi | Mr Connor S Adams | Florian Stumpf | Wyatt | Adam Helešic | Jakob Sorenson | Nick
| Hannah Petherick | Erik Loide | Bradley | Artbarte | Christophe Delahaye | Cheng Bowen |
Jeremy Kamrath | Amanda Koh | Philip Ewert | Michael Heywood | Tralexium | Maxfield Hewitt | J
G | Chong Ren Li | Max Gratzl | Aaron | Ronald Gibson | Niall Brady | Sam Morgan | David Stewart
| Marcel Gentner | Martin Prazak | Josh Bousfield | Pietro Maggi | Adrian Hernik | Sebastian |
Jonathan Scheer | Alexander Velchev | Kai-fabian Schönauer | Mattia Belletti | Izaac Jordan |
Kyle Donaldson | Daniel Maiorano | Jesse Schramm | Jonas Lund | Peter Massarello | David
Carr | David Dunham | Mario Schilling | Jw | Danni | Antonio Sobrinho | Eric | Kaptain Radish | Z
| Andrew Robyn | Andicus | Jacob | Alek D Flener-satre | Joseph Turnquist | Patrick D Rupp |
Davide | Bandit | Matt | Wataru Ikeda | Andi Doty | Ryan | Joshua Rehling

Nathan Van Fleet | James Walter | Chris Mcpherson | Pixel Pilgrim Studios | Marcell
Kovacs | Roro | Dominykas Djačenko | Alex Robbins | Ng Game Junk | Arturs Cernuha
| Iramis Valentin | Niko Van Wonterghem | Michael Andrew Revit | Garrett Drinkard
| Andrew Pachuilo | Drak | Kerry Shawcross | Vincent Michael Mooney | Brandon
Wolff | John Laschober | Trevan Haskell | Andrew Greenwood | Dani Fey | Alec Burmeister |
Kieran Lane | Seonhee Lim | Javier Teodoro De Molina Mejías | Keith Crabtree | Damu | Raffaele
Picca | Anthony | Luis Ernesto Torres Santibañez | Adam Webber | Daniel | Mfernandez | Ryan |
Miranda Schweitzer | Taylor Bray | Mark Hubczenko | Fathony Teguh Irawan | Renju Mathew |
Alexander Cruz | Tyler Smith | Brad | Austin Baldwin | Jacobarkadie | Kevin Bergstrom | Alex |
Daniel | Matthew Xiong | Quinn Collins | Abbey Howard | Michael Hrabánek | Vegard | Harrison
Grant | Adrien Baud | Sa Sa | Raf Van Baelen | Santitham | Marianne Stalter | Terrence Coy |
Jodie | Nicholas Jainschigg | Aerton | Bee | Trevor Starick | Michael | Matej Šulek | River Vondi
| Jacob | Sam Shockey | Florencia Sanchez | Ezequiel Selva | Cody Keller | David Yaeger |
Pewbop | Gabriel Benjamin Valdez De León | Ask Skivdal | Gautam Dey | Vinicius Nakamura
| Semyon | Bas | Felix Bytow | Adam Reid | Clement | Kirill | Bawm | Hai Bo Wang | Anthony
Leblanc | Duncan | Étienne Guerette | Wesley “Sky” Nworisa | Luca Vazzano | R Wijdenes | Tyler
Hamilton | Ben Witte | Morchaint | Elliott | Alex Wang | Ethan

Jochen Weidner | George Dimopoulos | Christopher Margroff | Milan Koyš | Alexandria
Lee | Sterling | Samir Alajmovic | David Devcic | Shane Gadsby | Murilo Gama |
Max Daniels | Dylmm | James Minshull | Cory Kennedy-darby | Jeff T Klouda | Amit
Feldman | Florian Frankenberger | Stefan Maier | Barbara Zacharewicz | Pablo
Mansanet | Francis D Cleary | Joseph Krueger | Morgan Hezon | Benjamin Reber |
Jonathan Tippett | Danny Eisenga | Michael Laplante | Matthew Arabo | Ryan Reed | Ludger
Meyer-wuelfing | Raji | Thefre | Amani A | Iván D | Adrian | Max Olin | Ruben Rodriguez Torres |
Eidel Odiseo Giménez | Kay | Daniel Steven Sarmiento Santos | Kevin Do | Anthony Tarr | Adam
Slavik | Alberto Flor | Jared Moore | Hendrik Poettker | Eric Lugn | Zack | Warp Vessel | Tommi
Shelton | Earl Braxton Mckenna | Adrien Farfals | Jonathan Cater | Luke | Matteo Stefanini |
Oschijns | Keo Daniel Bun | Kevin Young | Hardtrip | Luc-frederic Langis | Pj Palomaki | Laurent
Tourte | Geowarin | Barmy | Khairi Harris | Nick Strasky | Brent Lovatt | Alexander Graham |
M Koray Duman | Houman Jafarnia | Cameron Oehler | Cesar | Justin Doornbos | Tj Morse
| Jeavh | Street Nw | Pete | Ross | Sophie Freeman | Gerrit Schimpf | Brody Wilton | Deniz |
Matthew Laurenson | Nicholas Venditti | Lucas Long | Jardian Halliday | Ahmad Takhimi Bin
Ahmad Mahayuddin | Joe Bechtel | Delbert Vick | Stefan Larsen | Jollymeatball | Kelita Nolan |
Lily Jin | Terence Steabner | Daniel Hall | Elia Théo | Matthew Brennan | Razvan | Ditherdream |
Sjoerd Van Kampen

Jettelly wishes you success
in your professional career.
