Pixelation Shader - Unity Shader

This is one correct way (of many) to implement pixelation as a post-processing effect.
The effect works at any aspect ratio without pixel-size scaling issues, and it takes very little code to set up.
[Image: the pixelate image effect in action]

In order to get this to work, two components have to be set up:
1) The pixelation image effect
2) The script - which will be attached to the camera

So let's get started by creating a new image effect shader.
Let's take a look at our ShaderLab properties:
_MainTex("Texture", 2D) = "white" {}
That's it. Everything else will be private and not shown in the editor.
Here is what is defined alongside _MainTex but kept out of the Properties block:
sampler2D _MainTex;
int _PixelDensity;
float2 _AspectRatioMultiplier;
We will pass _PixelDensity & _AspectRatioMultiplier values from the script.
As this is an image effect there is no need to play around with the vertex shader.
Let's take a look at our fragment shader:
fixed4 frag (v2f i) : SV_Target
{
    float2 pixelScaling = _PixelDensity * _AspectRatioMultiplier;
    i.uv = round(i.uv * pixelScaling) / pixelScaling;
    return tex2D(_MainTex, i.uv);
}
That's it? 😒
Well... yes. But still, let me break it down.
First, let's take out the aspect-ratio correction:
i.uv = round(i.uv * 100.0) / 100.0;
If you replace the previous i.uv statement with this one, the image will still be pixelated, but if your game's aspect ratio is anything other than square, the pixels will stretch horizontally when the window is wider than it is tall, and vice versa.
Now let's get into the meat of it:
The round function converts a float like 0.6 to 1 and 0.4 to 0; in other words, it converts a float to the closest integer.
The uv.x & uv.y values go from 0 to 1. So if you multiply the uv co-ordinate (0.123, 0.123) by 100, you end up with (12.3, 12.3); round it and you get (12, 12); then bring it back into the 0-to-1 range by dividing by the same amount, giving (0.12, 0.12). You get the same result for the uv co-ordinates (0.121, 0.121), (0.124, 0.124) and so on.
So every fragment in that block samples the same uv co-ordinate, and therefore outputs the same color. That's how the pixelation effect is achieved.
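If it helps to see the arithmetic outside of HLSL, here is a small Python sketch of the same uv snapping (illustrative only; the function name pixelate_uv is made up for this example, and note that half-way rounding cases may differ between Python and HLSL):

```python
def pixelate_uv(u, v, pixel_scaling_x, pixel_scaling_y):
    """Snap a uv co-ordinate to its pixel block, mirroring
    round(i.uv * pixelScaling) / pixelScaling from the shader."""
    return (round(u * pixel_scaling_x) / pixel_scaling_x,
            round(v * pixel_scaling_y) / pixel_scaling_y)

# All of these uvs fall into the same block, so they sample the same texel:
print(pixelate_uv(0.123, 0.123, 100, 100))  # (0.12, 0.12)
print(pixelate_uv(0.121, 0.121, 100, 100))  # (0.12, 0.12)
print(pixelate_uv(0.124, 0.124, 100, 100))  # (0.12, 0.12)
```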
We will now see how the aspect ratio is corrected in the C# script.
using UnityEngine;

[ExecuteInEditMode, RequireComponent(typeof(Camera))]
public class PixelateImageEffect : MonoBehaviour
{
    public Material material;
    public int pixelDensity = 80;

    private void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        Vector2 aspectRatioData;
        if (Screen.height > Screen.width)
            aspectRatioData = new Vector2((float)Screen.width / Screen.height, 1);
        else
            aspectRatioData = new Vector2(1, (float)Screen.height / Screen.width);
        material.SetVector("_AspectRatioMultiplier", aspectRatioData);
        material.SetInt("_PixelDensity", pixelDensity);
        Graphics.Blit(source, destination, material);
    }
}
You might be familiar with the OnRenderImage function if you have been following the blog for a while.
This function runs after everything on screen is rendered, and two render textures are passed into it: source & destination.
The source is the input RenderTexture and the destination is the output RenderTexture. You perform some changes on the source and write the result to the destination; Graphics.Blit(source, destination, material) does exactly that.

The 'material' property is the post-processing material that was made with the shader.
We pass in the values for the pixel-density and aspect-ratio multiplier with the material.SetInt() & material.SetVector() methods.
Now we determine the value of 'aspectRatioData' which is a Vector2.
The x component of aspectRatioData is the multiplier for x-axis of uv and y component is for y-axis of uv.
if (Screen.height > Screen.width)
    aspectRatioData = new Vector2((float)Screen.width / Screen.height, 1);
else
    aspectRatioData = new Vector2(1, (float)Screen.height / Screen.width);
So if the height is greater, the x component is the ratio of width to height and the y component is 1 (which leaves the uv unchanged when multiplied).
If the width is greater, the y component is the ratio of height to width and the x component is 1.
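As a sanity check, here is the same branch sketched in Python (illustrative only; the function names are made up for this example) showing that scaling the longer axis's multiplier keeps the pixel blocks square on screen:

```python
def aspect_ratio_multiplier(width, height):
    """Mirror the C# branch: shrink the uv multiplier along the
    longer screen axis so pixel blocks stay square."""
    if height > width:
        return (width / height, 1.0)
    else:
        return (1.0, height / width)

def effective_pixel_counts(width, height, pixel_density):
    """Number of pixel blocks along each axis after correction."""
    mx, my = aspect_ratio_multiplier(width, height)
    return (pixel_density * mx, pixel_density * my)

# Portrait 1080x1920 with pixelDensity = 80:
# 80 * (1080/1920) = 45 blocks across, 80 blocks down.
print(effective_pixel_counts(1080, 1920, 80))  # (45.0, 80.0)
# Each block is 1080/45 = 1920/80 = 24 screen pixels wide and tall: square.
```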
I hope you learned something today. Do not forget to share this if you liked it.
Support Bitshift Programmer by leaving a like on Bitshift Programmer Facebook Page and be updated as soon as there is a new blog post.
If you have any questions that you might have about shaders, C# or Unity development in general don't be shy and leave a message on my facebook page or down in the comments.
For the entire source code, go : HERE
For more Shader development goodness, go : HERE
For Unity development tutorials, go : HERE
