Advanced Billboard Shader + World-Space UI Support

As you know, billboarding is basically a plane with a texture on it that always faces the camera.

These are some examples of what we are going for.

This tutorial is going to be pretty straightforward and easy to follow. You will learn how to make a billboard shader that not only keeps facing the camera but also keeps its relative scaling intact.

We will also provide an option to keep it rendered on top of all the other objects in the scene. This will be most useful for world-space UI that needs to be rendered on top of other geometry.

We will be making two shaders here:

  1. Modified Default Unlit shader :- This one is a general-purpose shader (easy to modify further).
  2. Modified Default UI shader :- This one supports everything a UI shader supports, along with our billboarding capabilities.

So let's get started with making the first one. As usual, create a new Unlit shader and dive into the properties we need.

Properties
{
    _MainTex ("Texture Image", 2D) = "white" {}
    _Scaling ("Scaling", Float) = 1.0
    [Toggle] _KeepConstantScaling ("Keep Constant Scaling", Int) = 1
    [Enum(RenderOnTop, 0, RenderWithTest, 4)] _ZTest ("Render on top", Int) = 4 // default to the regular z-test
}

Don't get distracted by those shader attributes ([xyz]). These are really useful little statements that help us format our material editor interface. I will be adding another tutorial that goes through all of them in detail.

Now let's look at how these are declared inside the CGPROGRAM block.

uniform sampler2D _MainTex;
int _KeepConstantScaling;
float _Scaling;

You might have noticed that the _ZTest property doesn't show up here. That's because it is used as a sub-shader render-state value, and while we are there we have to set some sub-shader tags as well.

SubShader
{
    Tags { "Queue" = "Transparent" "IgnoreProjector" = "True" "RenderType" = "Transparent" "DisableBatching" = "True" }
    ZWrite On /*Write to the depth buffer*/
    ZTest [_ZTest] /*0 (Disabled) skips the z-test and renders on top; 4 (LessEqual) is the regular z-test*/
    Blend SrcAlpha OneMinusSrcAlpha /*Standard alpha blending for semi-transparent and transparent objects*/
    Pass
    {
        .
        .
        .

Note:

  • We have set 'DisableBatching' to True. If an object is dynamically batched, the vertex positions we receive are already in world space, but we will be writing the vertex shader under the assumption that the vertex data is in local space.
  • This shader only works with the 'Quad' primitive, or any geometry whose vertices lie in the local x-y plane. The default Plane (which lies in the x-z plane) will not work.

Time for the Vertex Shader

v2f vert(appdata v)
{
    v2f o;
    /*1*/ float relativeScaler = (_KeepConstantScaling) ? distance(mul(unity_ObjectToWorld, v.vertex), _WorldSpaceCameraPos) : 1;
    /*2*/ float4 viewSpaceOrigin = mul(UNITY_MATRIX_MV, float4(0.0, 0.0, 0.0, 1.0));
    /*3*/ float4 scaledVertexLocalPos = float4(v.vertex.x, v.vertex.y, 0.0, 0.0) * relativeScaler * _Scaling;
    /*4*/ o.vertex = mul(UNITY_MATRIX_P, viewSpaceOrigin + scaledVertexLocalPos);
    /*5*/ o.uv = v.uv;
    return o;
}

We will go through each line in detail.

  1. If _KeepConstantScaling is false, we don't apply any relative scaling. If we do apply it, we convert the vertex position from local space to world space, take its distance from the camera, and assign that to relativeScaler (see the sketch after this list for why this keeps the on-screen size constant).
  2. mul( UNITY_MATRIX_MV, float4( 0.0, 0.0, 0.0, 1.0)) transforms the object's origin into view-space coordinates, which we assign to viewSpaceOrigin.
  3. The vertex gets scaled by our relativeScaler and _Scaling values, flattened onto the local x-y plane, and assigned to scaledVertexLocalPos.
  4. We then add viewSpaceOrigin and scaledVertexLocalPos to get our view-space vertex position. Since the view-space x and y axes are always aligned with the screen, offsetting from the origin this way makes the quad face the camera. Then we apply perspective projection with mul( UNITY_MATRIX_P, viewSpaceOrigin + scaledVertexLocalPos).
  5. Assign our uv coordinates.
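
To see why step 1 works: under perspective projection the on-screen size of the quad is roughly (size * _Scaling) / distance, so multiplying the size by the camera distance first cancels that division and the quad stays the same size on screen. Here is a hypothetical helper (not part of the shader above, and assuming the same _KeepConstantScaling and _Scaling uniforms) that factors steps 1 and 3 out of the vertex function:

float4 ScaledBillboardOffset(float4 localVertex)
{
    /*World-space distance from this vertex to the camera*/
    float camDist = distance(mul(unity_ObjectToWorld, localVertex).xyz, _WorldSpaceCameraPos);
    /*Projection divides the size by this distance, so pre-multiplying by it keeps the screen size constant*/
    float relativeScaler = _KeepConstantScaling ? camDist : 1.0;
    /*Flatten the vertex onto the local x-y plane and apply the scaling*/
    return float4(localVertex.x, localVertex.y, 0.0, 0.0) * relativeScaler * _Scaling;
}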

There are no modifications to the fragment shader.
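
For reference, this is roughly what the untouched parts look like; it's essentially Unity's default Unlit template with the fog code stripped out:

struct appdata
{
    float4 vertex : POSITION;
    float2 uv : TEXCOORD0;
};

struct v2f
{
    float2 uv : TEXCOORD0;
    float4 vertex : SV_POSITION;
};

fixed4 frag(v2f i) : SV_Target
{
    /*Just sample the texture; blending is handled by the Blend state we set in the sub-shader*/
    return tex2D(_MainTex, i.uv);
}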

To create a fully UI-compatible shader we will use the 'UI-Default' shader that Unity provides and make the same changes to it. You can get the Unity-provided shaders HERE. Just select 'Built-in shaders' from the download drop-down for your desired Unity version.
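
The changes mirror the ones above. As a rough sketch (the field names follow Unity's UI-Default source and may differ slightly between Unity versions), the modified vertex function could look like this:

v2f vert(appdata_t v)
{
    v2f OUT;
    OUT.worldPosition = v.vertex;
    /*Same billboarding logic as in our Unlit shader*/
    float relativeScaler = _KeepConstantScaling ? distance(mul(unity_ObjectToWorld, v.vertex).xyz, _WorldSpaceCameraPos) : 1.0;
    float4 viewSpaceOrigin = mul(UNITY_MATRIX_MV, float4(0.0, 0.0, 0.0, 1.0));
    float4 scaledVertexLocalPos = float4(v.vertex.x, v.vertex.y, 0.0, 0.0) * relativeScaler * _Scaling;
    OUT.vertex = mul(UNITY_MATRIX_P, viewSpaceOrigin + scaledVertexLocalPos);
    OUT.texcoord = v.texcoord;
    OUT.color = v.color * _Color;
    return OUT;
}

Everything else in the UI shader (the rect clipping and alpha test in the fragment shader) stays as-is.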

The source code for both shaders can be found : HERE

That's it! Hope you learnt something. Support Bitshift Programmer by leaving a like on the Bitshift Programmer Facebook page and be updated as soon as there is a new blog post.
If you have any questions about shaders or Unity development in general, don't be shy; leave a message on my Facebook page or down in the comments.
For more shader development tutorials, go : HERE
For Unity development tutorials, go : HERE
