Shaders in Unity: Core Concepts

Shaders are a powerful tool in game development, allowing developers to create rich, visually striking effects that can greatly enhance the realism and immersion of a game. However, they can also be complex and difficult to understand.

We will begin by discussing what shaders are and why they are important in Unity. We will then dive into the basics of Unity's ShaderLab language, including how to structure a basic shader, how to define input and output structures, and how to create a simple diffuse shader. Finally, we will explore more advanced concepts such as textures and sampling, lighting and shading, and special effects.

We will also provide a list of additional resources for learning more about shaders in Unity. This guide should serve as a starting point for developers who are new to shaders and want to learn more about how to use them effectively in Unity.

What is a shader?

A shader is a small program with a “.shader” extension (e.g. color.shader) that can be used to generate interesting effects in our projects. Inside, it contains mathematical calculations and lists of instructions (commands) that process the color of each pixel within the area an object covers on the screen.

This program allows us to draw elements (using coordinate systems) based on the properties of a polygonal object. Shaders are executed by the GPU, whose parallel architecture consists of thousands of small, efficient cores designed to handle tasks simultaneously, whereas the CPU is designed for sequential serial processing.

Note that Unity has three types of files associated with shaders. Firstly, we have programs with the “.shader” extension that are capable of compiling in the different types of render pipelines.

Secondly, we have programs with the “.shadergraph” extension that can only compile in either Universal Render Pipeline (URP) or High Definition RP (learn more about URP). In addition, we have files with the “.hlsl” extension that allow us to create customized functions; these are generally used within a node type called Custom Function, found in Shader Graph.

There is also another type of program with the extension “.cginc”, which we will review in detail later on. For now, we will limit ourselves to making the following association: “.cginc” is linked to CGPROGRAM in “.shader” files, and “.hlsl” is linked to HLSLPROGRAM in “.shadergraph”. Knowing this analogy is fundamental because each extension fulfills a different function and is used in specific contexts.
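
To make this association concrete, below is a minimal sketch of a “.shader” file in the Built-in RP style. The shader name “Examples/IncludeAssociation” is purely illustrative; “UnityCG.cginc” is the real include file shipped with Unity, and a Scriptable RP shader would use an HLSLPROGRAM block with “.hlsl” includes instead.

Shader "Examples/IncludeAssociation"
{
    SubShader
    {
        Pass
        {
            // Built-in RP wraps its program in CGPROGRAM / ENDCG and pulls in
            // ".cginc" includes; a Scriptable RP shader would instead use
            // HLSLPROGRAM / ENDHLSL with ".hlsl" includes.
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc" // ".cginc" helpers shipped with Unity

            float4 vert (float4 vertex : POSITION) : SV_POSITION
            {
                // transform the vertex from object space to clip space
                return UnityObjectToClipPos(vertex);
            }

            fixed4 frag () : SV_Target
            {
                return fixed4(1, 1, 1, 1); // plain white output
            }
            ENDCG
        }
    }
}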

In Unity, there are at least four types of structures defined to generate shaders: the combination of vertex and fragment shaders, the surface shader for automatic lighting calculation, and the compute shader for more advanced use cases. Each of these structures has predefined properties and functions that facilitate the compilation process, and since the software adds these structures automatically, we can easily define our own operations.

Why are shaders so important in Unity?

The importance of shaders in Unity lies in their ability to create highly detailed and complex visual effects. Without shaders, a 3D scene would be rendered in a very basic way, with no reflections, no transparency, and limited lighting options. Shaders, on the other hand, allow developers to create rich and highly detailed environments, with realistic materials and complex lighting. They also play a key role in efficient, optimized rendering performance.

The shader programming language

There are several popular graphics APIs (Application Programming Interfaces) that are used by programs, such as Unity, to handle graphics for us. Each graphics API has a corresponding shading language:

  • The OpenGL API, a popular cross-platform graphics library, uses a shading language called GLSL (for OpenGL Shading Language).
  • DirectX, which is designed for use on Microsoft’s platforms, uses HLSL (High-Level Shading Language).
  • Cg, a deprecated shading language developed by Nvidia, uses the common feature set of GLSL and HLSL and can cross-compile to either depending on the target hardware.

The shading language is what you write shader code in. A game or game engine will compile shaders written in one of those languages to run on the GPU. Although it is possible to write shaders using any one of GLSL, HLSL, or Cg, modern Unity shaders are written in HLSL.

There is an extra layer to it in Unity. Unity uses a proprietary language called ShaderLab. All code-based shaders in Unity (with the exception of Shader Graph assets and compute shaders) are written in ShaderLab syntax, which achieves several aims at once:

  • ShaderLab provides ways to communicate between the Unity Editor, C# scripts, and the underlying shader language.
  • It provides an easy way to override common shader settings. In other game engines, you might need to delve into settings windows or write graphics API code to change blend, clipping, or culling settings, but in Unity, we can write those commands directly in ShaderLab (see the sketch after this list).
  • ShaderLab provides a cascading system that allows us to write several shaders in the same file, and Unity will pick the first compatible one to run. This means we can write shaders for different hardware or render pipelines, and the one that matches the user’s hardware and the project’s chosen render pipeline will get picked.
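
As a quick preview, here is a minimal, illustrative sketch of those override commands at the SubShader level; the shader name and the specific values are placeholders, not recommendations.

Shader "Examples/RenderStateOverrides"
{
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }

        Cull Off                         // draw both faces of each triangle
        ZWrite Off                       // don't write to the depth buffer
        Blend SrcAlpha OneMinusSrcAlpha  // standard alpha blending

        Pass
        {
            // the Cg/HLSL program would go here; the commands above
            // apply to every Pass in this SubShader
        }
    }
}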

It’ll become a lot easier to understand how this all works with a practical example, so let’s start writing some ShaderLab.

Shader types

To start creating a shader, we must first create a new project in Unity. If you are using Unity Hub, it is recommended to create the project with a recent version of the software (e.g., 2020, 2021, or 2022).

We are going to need a 3D template with Built-in RP to facilitate understanding of the graphics programming language. Once the project has been created, we must right-click in the Project window (Ctrl + 5 or Cmd + 5), go to Create, and select the Shader option. Create a new Unlit Shader.

As we can see, there is more than one type of shader; among them, we can find:

  • Standard Surface Shader
  • Unlit Shader
  • Image Effect Shader
  • Compute Shader
  • Ray Tracing Shader

The list of shaders is likely to vary depending on the version of Unity used to create the project. Another variable that could affect the number of shaders in the list is Shader Graph: if the project was created in Universal RP or High Definition RP, it may include the Shader Graph package, which increases the number of shaders that can be created.

We will not go into detail about this subject for now, since we must understand some concepts first; we will simply limit ourselves to working with the shaders that come by default in Built-in RP.

Before creating our first shader, we will briefly review the different types that exist in Unity.

Standard surface shader

This type of shader is characterized by its optimized code writing: it interacts with a basic lighting model and only works in Built-in RP. If we want to create a shader that interacts with light, we have two options:

  • Use an Unlit Shader and add mathematical functions that allow lighting rendering on the material
  • Or use a Standard Surface Shader, which has a basic lighting model that in some cases includes albedo, specular, and diffuse (a minimal sketch follows this list)
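
For reference, here is a minimal sketch of a Standard Surface Shader. The name “Examples/SimpleSurface” and the property set are illustrative; Lambert is the basic diffuse lighting model that Unity calculates for us.

Shader "Examples/SimpleSurface"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Color ("Color", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        CGPROGRAM
        // declare a surface function and a lighting model; Unity generates
        // the vertex and fragment stages for us
        #pragma surface surf Lambert

        sampler2D _MainTex;
        fixed4 _Color;

        struct Input
        {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            // we only fill in surface properties; lighting is automatic
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
            o.Albedo = c.rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}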

Unlit shader

The word “Lit” refers to a material affected by illumination, and “Unlit” is the opposite. The Unlit Shader uses a primary color model and will be the base structure we generally use to create our effects. This type of program, ideal for low-end hardware, has no optimization layer hiding its code; therefore, we can see its complete structure and modify it according to our needs. Its main feature is that it works in both Built-in and Scriptable RP.

Image effect shader

It is structurally very similar to an Unlit Shader. Image effect shaders are used mainly for post-processing effects in Built-in RP and require the “OnRenderImage” function (C#).

Compute shader

This type of program is characterized by running on the graphics card, outside the normal render pipeline, and is structurally very different from the previously mentioned shaders.

Unlike a common shader, its extension is “.compute” and its programming language is HLSL. Compute Shaders are used in specific cases to speed up some game processing.
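
As an illustrative sketch, the body of a hypothetical “SimpleCompute.compute” file could look like the following; a C# script would bind the output texture and launch the kernel with ComputeShader.Dispatch.

// declares which function acts as the kernel (the GPU entry point)
#pragma kernel CSMain

// output texture that the GPU threads write into
RWTexture2D<float4> Result;

// each thread group is 8 x 8 threads; one thread fills one pixel
[numthreads(8,8,1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    // write a simple procedural pattern based on the thread's pixel coordinate
    Result[id.xy] = float4(id.x & id.y, (id.x & 15) / 15.0, (id.y & 15) / 15.0, 0.0);
}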

For more information about the Compute Shader concept, check out the documentation.

Ray tracing shader

Ray tracing is a technique that makes light in video games behave as it does in real life. It works by simulating actual light rays, using an algorithm to trace the path that a beam of light would take in the physical world.

Ray Tracing Shader is a type of experimental program with the extension “.raytrace”. It allows ray tracing processing on the GPU and works only in High Definition RP, with some technical limitations. If we want to work with DXR (DirectX Raytracing), we need at least a GTX 1080 graphics card or equivalent with RTX support, Windows 10 version 1809+, and Unity 2019.3b1 onwards.

We can use this kind of program in place of a “.compute” shader for ray-casting processing algorithms, e.g., global illumination, reflections, refraction, or caustics.

Structure of a shader

To analyze its structure, we will use the Unlit Shader we created previously and call it “simpleColor”. As we already know, this type of shader uses a basic color model and does not have great optimization in its code, which will allow us to analyze its various properties and functions in depth.

When we create a shader for the first time, Unity adds default code to facilitate its compilation process. Within the program, we can find blocks of code structured in such a way that the GPU can interpret them. If we open our simpleColor shader, its structure should look like this:

Shader "Unlit/simpleColor"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // make fog work
            #pragma multi_compile_fog

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                UNITY_FOG_COORDS(1)
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                UNITY_TRANSFER_FOG(o,o.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // sample the texture
                fixed4 col = tex2D(_MainTex, i.uv);
                // apply fog
                UNITY_APPLY_FOG(i.fogCoord, col);
                return col;
            }
            ENDCG
        }
    }
}

We likely do not yet fully understand what is happening in the different code blocks of the shader we just created. However, to begin our study, we will pay attention to its general structure:

Shader "InspectorPath/shaderName"
{
    Properties
    {
        // properties in this field
    }

    SubShader
    {
        // SubShader configuration in this field
        Pass
        {
           CGPROGRAM
           // Program Cg - HLSL in this field
           ENDCG
        }
    }

    Fallback "ExampleOtherShader"
}

(The shader structure is the same whether we write Cg or HLSL; the only thing that changes is the program block, CGPROGRAM versus HLSLPROGRAM. Both compile in current versions of Unity for compatibility.)

The previous example shows the main structure of a shader. The shader starts with a path in the Inspector (InspectorPath) and a name (shaderName), followed by the properties (e.g., textures, vectors, colors), then the SubShader, and at the very end the optional Fallback. The Fallback assigns a different shader so that the graphics hardware can continue its process if it cannot use this one.

The “InspectorPath” refers to the place where we will select our shader to apply it to a material. This selection is made through the Unity Inspector.

We must remember that we cannot apply a shader directly to a polygonal object; instead, this has to be done through a previously created material. Our simpleColor shader has the path “Unlit” by default. This means that, in Unity, we must select our material, go to the Inspector, navigate to the Unlit path in the shader menu, and select the shader called simpleColor.

A structural factor we must take into consideration is that the GPU reads the program from top to bottom, linearly. Therefore, if we create a function and position it below the code block where it is used, the GPU will not be able to read it, generating an error in the shader processing.

Let’s do an exercise to understand this concept.

// 1. declare our function
float4 ourFunction()
{
  // your code here...
  return float4(0, 0, 0, 0); // placeholder so the function compiles
}

// 2. we use the function
fixed4 frag (v2f i) : SV_Target
{
  // we are using the function here
  float4 f = ourFunction();
  return f;
}

Don't expect this code to run or do anything useful. The snippet is only to demonstrate the position of one function in relation to another.

We will talk in detail about the structure of a function later. For now, the only important thing is that the structure in the previous example is correct, because the function “ourFunction” has been declared above the block of code that uses it. The GPU will first read “ourFunction” and then continue to the fragment stage called “frag”.

Let’s look at a different case:

// 2. we use the function
fixed4 frag (v2f i) : SV_Target
{
  // we are using the function here
  float4 f = ourFunction();
  return f;
}

// 1. declare our function
float4 ourFunction()
{
  // your code here...
  return float4(0, 0, 0, 0); // placeholder so the function compiles
}

This structure will generate an error because the function “ourFunction” has been written below the code block that is attempting to reference it.
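
One way to resolve this, sketched below, is to declare a prototype of the function above the code that uses it, as in C, and define its body further down; the return value here is just a placeholder.

// 1. prototype: tells the compiler that the function exists
float4 ourFunction();

// 2. we use the function
fixed4 frag (v2f i) : SV_Target
{
  // we are using the function here
  float4 f = ourFunction();
  return f;
}

// 3. the function body, defined below its first use
float4 ourFunction()
{
  return float4(0, 0, 0, 0); // placeholder
}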

Error and loading shaders

Occasionally, Unity encounters difficulty in rendering objects with their intended shaders. In such instances, Unity employs alternative shaders, such as:

  • The default error shader - Unity renders this type of shader when there’s a problem with that object’s material or shader; for example, if no material is assigned, if the shader doesn’t compile, or if the shader isn’t supported.
  • The loading shader - This type of shader is generated when asynchronous shader compilation is enabled, or in a development build when Shader Live Link support is enabled.
  • The Streaming Virtual Texturing error material - Unity typically renders this when a shader uses Streaming Virtual Texturing (SVT) and a texture is not configured properly.

Summary

Congratulations, you now possess the foundational knowledge to create custom Shaders in Unity. Additional resources such as Unity's Shader documentation can help in expanding your understanding. For further information on Unity's Shader Graph, check out this guide. Thank you for sticking around until the end! Please feel free to leave any feedback or questions in the comments.
