Make it Rain

Here’s a brief tutorial about how I made one of my first shaders. I’ve broken it down and explained every step, but if you’re not a programmer some of the terminology might trip you up. I’d recommend the Makin’ Stuff Look Good YouTube tutorials for a more intuitive introduction.

Here’s how I did it.

What I wanted was a shader that would take a random noise texture, render only the pixels whose colour values fall below a customizable threshold, and set their colour to another customizable colour. The rest would be invisible. Then I needed to be able to tile and offset the noise texture to give the impression of movement.

Shader "Rain" {
    Properties{
         
    }

    SubShader{
        Tags{

        }

        Pass{
            CGPROGRAM

            ENDCG
        }
    }
}

This is our empty shader. The CGPROGRAM and ENDCG keywords mark the start and end of the actual shader code; the rest is structure so Unity’s ShaderLab can interpret it properly.

Shader "Rain"{ tells Unity what this shader’s name is, we can also organise our shaders alongside Unity’s default shaders by mimicking a path name. For example Shader "Custom/Weather Effects/Rain" { will place our shader in a Custom and Weather Effects folder in Unity’s shader drop down list.

The Properties block tells Unity what to show in the editor; this is where you add your texture and anything else your shader needs. You might recognise these as the material settings, since that’s where they appear in the Unity editor, but they belong to the shader.

The SubShader block allows you to define multiple versions of the shader. If you have a particularly expensive shader, you can define multiple SubShaders and the player’s graphics card will go down the list and use the first SubShader block that works.

The Tags block is used to set the rendering order and some other parameters of a SubShader. It’s a set of key-value pairs; for more info check here.
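For illustration, a see-through effect would typically be told to draw after the opaque geometry. A sketch of what a filled-in Tags block might look like (these are standard ShaderLab tag names, though the shader in this tutorial leaves the block empty):

```
Tags {
    "Queue" = "Transparent"       // draw after all opaque geometry
    "RenderType" = "Transparent"  // categorises the shader for replacement passes
}
```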

The Pass block tells Unity to do a rendering run with the shader code inside. Each SubShader can have multiple Pass blocks, but keep in mind that each Pass is a separate run on the GPU; if you can achieve what you want from a shader in fewer Passes, it’s almost always better to do so.

Before we can start rendering stuff we need to define a few data types in our shader.

CGPROGRAM

struct appdata {
    float4 vertex : POSITION;
    float2 uv : TEXCOORD0;
};

struct v2f {
    float4 vertex : SV_POSITION;
    float2 uv : TEXCOORD0;
};

ENDCG

Why we’ve done this will become clear in a little bit; first I want to explain what we’ve done. The first struct has two fields, of type float4 and float2. A float4 is another data structure that holds four float values, whereas a float2, you guessed it, holds two. We’ve also put something after our variable types and names: POSITION and TEXCOORD0 are semantics that let the computer know what values to fill these with. TEXCOORD0 tells the computer to use the first set of uv coordinates; TEXCOORD1, TEXCOORD2 and TEXCOORD3 represent the others. These can be float2, float3 or float4, but we only want the x and y.

POSITION represents the position of the vertex, simple. But why is it a float4? The reason is that it is bundled with an extra w component used in the perspective math; don’t worry about it. The second struct consists of the same data but uses SV_POSITION instead of POSITION. The reason is a boring one: this semantic is needed for compatibility with PlayStation and a few other platforms.

#pragma vertex vert
#pragma fragment frag

v2f vert (appdata INvertex) {
    
}

float4 frag (v2f INfragment) : SV_TARGET {
    
}
ENDCG

OK, this is why we defined those structs: so we can use them in these functions. The #pragma directives tell the computer which functions are our vertex and fragment shaders. The SV_TARGET semantic tells the computer that this function returns the colour for the pixel, so it can stop and move on to the next pixel. The vert function is used to translate a point in the game to a point on the screen, and the frag function chooses what the colour should be for the pixel.

v2f vert (appdata INvertex) {
    v2f output;
    output.vertex = mul(UNITY_MATRIX_MVP, INvertex.vertex);
    output.uv = INvertex.uv;
    return output;
}

float4 frag (v2f INfragment) : SV_TARGET {
    return float4(1,1,1,1);
}
ENDCG

Let’s start with the frag function: it simply returns a white colour. The vert function looks a bit more complex, but isn’t really. It’s just preparing the output. The uv coordinates for the fragment are the same as the vertex’s, but the position needs to change from local position to screen position. There’s a pre-made matrix that we can use to transform the vertex we get from the computer. UNITY_MATRIX_MVP transforms the coordinates first from local coordinates to world coordinates (model), then to camera coordinates (view), then manipulates them to fit the projection. The mul function applies a matrix multiplication; remember that matrix multiplication is not commutative, so the order of the arguments matters.
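If the matrix step feels abstract, here’s a tiny CPU sketch in Python (not shader code, just the same arithmetic) of what mul does with a 4x4 matrix and a float4. The translation matrix T is only an illustrative example, much simpler than the real MVP matrix:

```python
# mul(M, v): a 4x4 matrix times a float4 column vector,
# the same operation the vert function performs on each vertex.
def mul(M, v):
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

# An illustrative translation matrix: moves a point by (1, 2, 3).
T = [[1, 0, 0, 1],
     [0, 1, 0, 2],
     [0, 0, 1, 3],
     [0, 0, 0, 1]]

p = [0, 0, 0, 1]     # local-space origin; w = 1 for positions
print(mul(T, p))     # [1, 2, 3, 1]
```

Because multiplication order matters, mul(T, p) treats p as a column vector on the right, which is what the shader expects.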

Congratulations, you’ve just written your first shader. Maybe you’d like to make it a bit more interesting?

Properties {
    _CustomColor("Colour", Color) = (1,1,1,1)
}
float4 _CustomColor;
float4 frag (v2f INfragment) : SV_TARGET {
    return _CustomColor;
}

We’ve done a few things here. We’ve defined an editor field for the variable _CustomColor, with the editor name "Colour" and the type Color, which defaults to white. We’ve also linked the _CustomColor variable in our Pass to the one in Properties by declaring it again. And finally we return _CustomColor instead of white in the frag function.

Properties {
    _CustomColor("Colour", Color) = (1,1,1,1)
    _MainTex("Noise Texture", 2D) = white { }
}
sampler2D _MainTex;
float4 _CustomColor;
float4 frag (v2f INfragment) : SV_TARGET {
    float4 noiseColor = tex2D(_MainTex, INfragment.uv);
    return noiseColor;
}


Here we’ve added the noise texture; I’m using this one. Again we’ve defined our texture in Properties. We’ve called it _MainTex because that’s the standard name for the main texture a shader uses, and Unity has some functions that rely on it. Again we’ve linked _MainTex in our Pass. The tex2D function takes a texture and a uv coordinate and outputs the colour of the texture at those coordinates. The result should look exactly like our noise texture.
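To build an intuition for tex2D, here’s a rough CPU model in Python. This is an illustrative sketch only: it assumes nearest-neighbour sampling and no wrapping, whereas a real sampler's filtering and wrap modes are configurable on the texture:

```python
# Simplified tex2D: map a uv coordinate in [0,1] to a texel
# and return its colour (nearest-neighbour, no filtering).
def tex2d(texture, uv):
    h = len(texture)
    w = len(texture[0])
    x = min(int(uv[0] * w), w - 1)   # column, clamped to the edge
    y = min(int(uv[1] * h), h - 1)   # row, clamped to the edge
    return texture[y][x]

# A tiny 2x2 greyscale "noise" texture, values in [0,1].
noise = [[0.1, 0.9],
         [0.6, 0.3]]

print(tex2d(noise, (0.75, 0.25)))   # 0.9
```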

Properties {
    _CustomColor("Colour", Color) = (1,1,1,1)
    _MainTex("Noise Texture", 2D) = white { }
    _NoiseThreshold("Intensity", Range(0,1)) = 0
}
sampler2D _MainTex;
float4 _CustomColor;
float _NoiseThreshold;
float4 frag (v2f INfragment) : SV_TARGET {
    float4 noiseColor = tex2D(_MainTex, INfragment.uv);
    
    clip(_NoiseThreshold - noiseColor.rgb);

    return _CustomColor;
}

Let’s go through this bit by bit. We’ve added a new variable in Properties: _NoiseThreshold is the proportion of the pixels that we will render. Because it’s a proportion, we’ve constrained its values between 0 and 1 using the Range(0,1) type. We’ve linked the threshold in the Pass and then used it in the clip function. What clip does is discard the pixel if the value passed in is less than 0. You can read the documentation here. We’ve also gone back to returning _CustomColor; this time only the pixels that aren’t clipped will be rendered.
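The clip logic is easy to sanity-check on the CPU. Here is a Python sketch of the fragment step (illustrative only; the frag function and its values are made up for the example, and None stands in for a discarded pixel):

```python
# clip(x) discards the pixel when x < 0, so a pixel survives
# only when its noise value does not exceed the threshold.
def frag(noise_value, threshold, colour):
    if threshold - noise_value < 0:
        return None          # clip(): pixel discarded, nothing drawn
    return colour            # pixel kept: uniform colour, not the noise

white = (1.0, 1.0, 1.0, 1.0)
print(frag(0.3, 0.5, white))   # rendered: noise 0.3 is under the threshold
print(frag(0.8, 0.5, white))   # None: noise 0.8 is over, pixel discarded
```

Raising the threshold lets more pixels through, which is why the property is labelled "Intensity".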

float4 _MainTex_ST;
v2f vert (appdata INvertex) {
    v2f output;
    output.vertex = mul(UNITY_MATRIX_MVP, INvertex.vertex);
    output.uv = INvertex.uv * _MainTex_ST.xy + _MainTex_ST.zw;
    return output;
}

Now we’re getting somewhere! Adding the tiling and offset values from the texture in the material is pretty easy. For every texture given to the shader, Unity makes a float4 that holds the scale and the translation of the texture, named [texture name]_ST. We still need to link it in the Pass, but we don’t need to define it in Properties. We can now transform the uv coordinates by multiplying them by the scale (_MainTex_ST.xy) and adding the translation (_MainTex_ST.zw). If the names of the components seem confusing, it’s because they are: xy is the first pair of values as a float2 and zw is the second pair.
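The uv transform itself is just a multiply and an add per component. A quick Python sketch (illustrative, not shader code; the tiling and offset values are example numbers):

```python
# _MainTex_ST.xy holds the tiling (scale) and _MainTex_ST.zw
# the offset that the material's inspector exposes.
def transform_uv(uv, st):
    sx, sy, ox, oy = st
    return (uv[0] * sx + ox, uv[1] * sy + oy)

# Tiling (2, 2) repeats the texture twice in each direction;
# offset (0.0, 0.25) scrolls it a quarter-texture vertically.
print(transform_uv((0.5, 0.5), (2.0, 2.0, 0.0, 0.25)))   # (1.0, 1.25)
```

Animating the offset each frame is what produces the falling-rain movement.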

So now you can mess around with the values of the shader we’ve defined in Properties including the tiling and offset values. I’ll put the whole shader below along with some examples. But first some links to good resources or tutorials on the subject.

Here’s the shader.

Shader "Custom/Weather Effects/Rain" {

	Properties{
		_MainTex("Noise Texture", 2D) = "white" { }
		_CustomColor("Noise Color", Color) = (1,1,1,1)
		_NoiseThreshold("Intensity", Range(0, 1)) = 0
	}

	SubShader{
		//We didn't use a Tag in this shader
		Pass{
			CGPROGRAM
			
			//define the functions
			#pragma vertex vert
			#pragma fragment frag

			//vertex structure
			struct appdata {
				float4 vertex : POSITION;
				float2 uv : TEXCOORD0;
			};
			
			//fragment structure
			struct v2f {
				float2 uv : TEXCOORD0;
				float4 vertex : SV_POSITION;
			};
	
			//linking definitions
			sampler2D _MainTex;
			float4 _MainTex_ST;
			float4 _CustomColor;
			float _NoiseThreshold;
			
			v2f vert(appdata INvertex) {
				v2f output;
				output.vertex = mul(UNITY_MATRIX_MVP, INvertex.vertex);		//transform to screen
				output.uv = INvertex.uv *_MainTex_ST.xy + _MainTex_ST.zw;	//allow tiling and offset
				return output;
			}
	
			float4 frag(v2f INfragment) : SV_Target{
				float4 noise = tex2D(_MainTex, INfragment.uv);	//get noise value

				clip(_NoiseThreshold - noise.rgb);				//discard pixel if too low
				
				return _CustomColor;							//use uniform colour
			}

			ENDCG
		}
	}
}

And some examples.

Palette Swap Shader

A palette swap is a technique used in games; it is the computational equivalent of colouring by numbers. In colouring by numbers you get a page with some numbers identifying a series of shapes, and a collection of little paint pots with numbers on the lids. The numbers tell you which shape should be which colour. The computer does the same thing with a palette swap. The computer wants to render a picture, and each pixel of this picture has a colour code. (An RGB value is three numbers that identify a colour in terms of how much Red, Green and Blue to mix.) The computer takes the colour code to the palette, and the palette exchanges that code for a new one and, in doing so, swaps the colour that would have been rendered. This technique was initially developed to show a lot of different types of characters using only one texture. It allowed developers to save a lot of memory and has been used ever since.

The reason I’ve been developing a palette-swapping shader is for use in my gravity inversion game. I found it quite difficult to sense which direction gravity was pointing, kind of the whole point, so I decided to completely change the colours of everything on the screen while gravity was inverted. This let me quickly change the colour of all of the textures rendered in my game by changing only a small 16-pixel sprite. I found that this didn’t cause any noticeable slowdown, even on mobile devices. Then it occurred to me that, if the palettes are small and easy to produce, why not let the player choose which colour schemes they would like?

I’m developing my game in Unity2D, so how did I manage it? I give two textures to the shader: one is the original texture, the other is the palette. I then need to convert the colour of the first texture into coordinates in the second. I decided to use the red channel only, and use its value between 0 and 1 to decide how far along the palette to go. Once I had the new colour I was nearly in the clear. I also wanted to preserve the transparency of the original texture, so after changing the render type from opaque I copy the old alpha (transparency) value over to the new colour. I’m using textures rather than sprites to preserve texture scrolling for my parallax background.
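Here’s a Python sketch of the lookup described above (illustrative only; the function name and the palette values are made up for the example): red channel in, palette colour out, alpha preserved.

```python
# The red channel of the source pixel (0..1) picks a horizontal
# position in a one-row palette; the source alpha is carried over.
def palette_swap(src_rgba, palette_row):
    r = src_rgba[0]
    index = min(int(r * len(palette_row)), len(palette_row) - 1)
    new_rgb = palette_row[index]
    return new_rgb + (src_rgba[3],)   # keep the original transparency

# An example 3-colour palette row: dark blue, red, yellow.
palette = [(0.0, 0.0, 0.5), (0.8, 0.1, 0.1), (1.0, 1.0, 0.0)]

print(palette_swap((0.4, 0.4, 0.4, 0.5), palette))   # (0.8, 0.1, 0.1, 0.5)
```

Swapping the palette row then recolours every texture at once, without touching the source art.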

After putting it all together, I’ve found it works well. All I have to do is send an event to the ColourInverter script I wrote, which accesses the shader properties and swaps out the current palette. I decided to do it this way because I found a shader if statement to be less efficient than changing the palette texture (I haven’t tested this on low-end phones).