
Alto's Adventure Style Procedural Surface Generation Part 1



A Screenshot of Alto's Adventure Gameplay
This game appears to be a strictly 2D game, but if you have played it enough you will notice that some of the art assets look like they are 3D (I don't know if they actually are). If you haven't played the game, you are missing out on one of the most visually pleasing and calming games out there (there is literally a mode called Zen mode).
Anyway, I am going to show you how to make a procedural 2D world (without the trees, buildings and background) like in Alto's Adventure.
You may notice I have a plane extending along the Z-axis, giving the surface a depth that is not there in Alto's Adventure; if you want to know how that is done, it will be covered in part 2.
To achieve the exact look of Alto's Adventure (I'm leaving that up to you), only minimal changes are needed to the code I am going to explain.
We are going to use Unity's built-in Plane mesh to create the 2D surface, as the Plane comes with 11 vertices along both its width and its height (10 quads per side, 121 vertices in total).
That makes modifying them easy, because we know exactly how many vertices there are in each row of the mesh.
All of this goes inside a class called 'SurfaceGenerator'; you can call it whatever you want, but remember that this class will be used again in part 2.
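Before we get into the pieces, here is a rough sketch of how they fit together; the '[RequireComponent]' line is my own addition as a safety net (the original snippets simply assume a MeshFilter is present), and each part of the class is filled in step by step below.

using UnityEngine;

// Attach this script to Unity's built-in Plane (GameObject > 3D Object > Plane).
[RequireComponent(typeof(MeshFilter))]
public class SurfaceGenerator : MonoBehaviour
{
   // Member variables, Start(), Update(), GenerateSurface() and
   // MeshCalculate() are all explained in the sections that follow.
}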
So here are the member variables used:
public bool generateContinuously = false;  
public bool generateCollider = false;  
[Range(0.1f,50.0f)]  
public float yScaling = 5.0f; 
[Range(0.1f,20.0f)]  
public float detailScaling = 1.0f;  
[HideInInspector]  
public Vector3[] vertices;  
private Mesh mesh;  
Nothing fancy going on, just declaring some variables. The '[Range(float, float)]' attribute makes the inspector show a slider clamped to the values given in its parameters.
The '[HideInInspector]' attribute prevents a public field from being shown in the inspector.
void Start()  
{  
   mesh = GetComponent<MeshFilter>().mesh;  
   vertices = mesh.vertices;  
}  
void Update()  
{  
   GenerateSurface();  
}  
Nothing fancy in the Start & Update functions either. One thing worth noting: accessing '.mesh' on the MeshFilter gives us this object's own instance of the Plane mesh, so editing its vertices won't touch the shared Plane asset.
Now we will look at the 'GenerateSurface' function.
void GenerateSurface()
{
   vertices = mesh.vertices;
   int counter = 0;
   // The Plane has 11 x 11 vertices, so visit every row (i) and column (j).
   for (int i = 0; i < 11; i++)
   {
      for (int j = 0; j < 11; j++)
      {
         MeshCalculate(counter, i);
         counter++;
      }
   }
   mesh.vertices = vertices;
   mesh.RecalculateBounds();
   mesh.RecalculateNormals();
   if (generateCollider)
   {
      Destroy(GetComponent<MeshCollider>());
      MeshCollider collider = gameObject.AddComponent<MeshCollider>();
      collider.sharedMesh = null;
      collider.sharedMesh = mesh;
   }
}
Now we will look at the important stuff.
MeshCalculate(counter, i); 
There is a function called 'MeshCalculate' that takes the index of the current vertex and the row that vertex is in (basically its vertical offset; keep in mind this is in object space, so 'vertical' does not necessarily mean up-down in world space). It then modifies that vertex of the mesh, as we will see below.
mesh.RecalculateBounds();  
RecalculateBounds takes the entire set of vertices and computes a box that encloses all of those points.
We have to do this in order to make sure the mesh's bounding volume stays correct after we move its vertices.
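If you ever want to see that box, a quick way to check it (my own addition, not part of the tutorial code) is to draw 'mesh.bounds' as a gizmo while the object is selected:

void OnDrawGizmosSelected()
{
   if (mesh == null) return;
   Gizmos.color = Color.yellow;
   // mesh.bounds is in local space, so draw it using this object's transform.
   Gizmos.matrix = transform.localToWorldMatrix;
   Gizmos.DrawWireCube(mesh.bounds.center, mesh.bounds.size);
}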
collider.sharedMesh = null;  
collider.sharedMesh = mesh;  
sharedMesh is the mesh the MeshCollider uses for collision detection.
We are just setting it to the updated mesh, after setting the previous one to null first (not strictly necessary) to help the old collision data get cleaned up.
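Destroying and re-adding the MeshCollider every frame works, but if that feels heavy-handed, a variation I sometimes use (not part of the original code) is to keep a single collider around and only reassign its sharedMesh:

if (generateCollider)
{
   // Reuse one MeshCollider instead of destroying and re-adding it each frame.
   MeshCollider collider = GetComponent<MeshCollider>();
   if (collider == null)
   {
      collider = gameObject.AddComponent<MeshCollider>();
   }
   collider.sharedMesh = null;   // clears the old cooked collision data
   collider.sharedMesh = mesh;   // rebuilds it from the updated mesh
}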
void MeshCalculate(int vertexIndex, int yOffset)
{
   if (generateContinuously)
   {
      // Scroll the noise sample point with Time.time so the surface keeps changing every frame.
      vertices[vertexIndex].z = Mathf.PerlinNoise
          (Time.time + (vertices[vertexIndex].x + transform.position.x) / detailScaling,
           Time.time + (vertices[vertexIndex].y + transform.position.y)) * yScaling;
      vertices[vertexIndex].z -= yOffset;
   }
   else
   {
      // Sample the noise based only on where the vertex sits in the world.
      vertices[vertexIndex].z = Mathf.PerlinNoise
         ((vertices[vertexIndex].x + transform.position.x) / detailScaling,
          (vertices[vertexIndex].y + transform.position.y)) * yScaling;
      // Push each row back by its row index so the rows don't collapse onto a single line.
      vertices[vertexIndex].z -= yOffset;
   }
}
This function basically uses Perlin Noise to generate those up and down waves/hills.
vertices[vertexIndex].z -= yOffset;  
We have this line to prevent all the vertices from bunching up into a single line: since every vertex's z gets overwritten with the noise height, subtracting the row index pushes each row back by one unit so the rows stay spread apart.
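To make that concrete, here is a throwaway illustration (my numbers, not from the tutorial): pretend the noise returned 0.4 for every sample and yScaling is 5.

for (int row = 0; row < 11; row++)
{
   float noiseHeight = 0.4f * 5.0f;   // every vertex would land at z = 2.0 without the offset
   float rowZ = noiseHeight - row;    // 2.0, 1.0, 0.0, -1.0, ... so each row keeps its own depth
   Debug.Log("row " + row + " -> z = " + rowZ);
}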
if(generateContinuously)  
{        
   vertices[vertexIndex].z = Mathf.PerlinNoise  
       (Time.time + (vertices[vertexIndex].x + transform.position.x) / detailScaling,  
        Time.time + (vertices[vertexIndex].y + transform.position.y)) * yScaling;  
   vertices[vertexIndex].z -= yOffset;  
}  
If the 'generateContinuously' bool is true, Time.time is added to the noise sample coordinates, so the sample point slides through the noise field every frame and the surface keeps animating.
If 'generateContinuously' is false, the only inputs to the Perlin noise are the position of each individual vertex plus the position of the object in world space, so the heights only change if the object itself moves.
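The reason adding Time.time animates the surface is that Mathf.PerlinNoise is deterministic: the same sample coordinates always return the same value, so the heights only change when the sample point itself moves. A tiny standalone sketch of that idea (not part of the surface code):

// Same coordinates, same result: 'a' and 'b' are always equal.
float a = Mathf.PerlinNoise(1.3f, 2.7f);
float b = Mathf.PerlinNoise(1.3f, 2.7f);

// Sliding the sample point with Time.time is what makes the value (and the surface) change each frame.
float animated = Mathf.PerlinNoise(Time.time + 1.3f, Time.time + 2.7f);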
Source code for Part 1 is available HERE.
Go to Part 2 for the rest of the tutorial, the source code and the unitypackage (contains everything). 😀
For more Unity development tutorials go HERE.
Support Bitshift Programmer by leaving a like on the Bitshift Programmer Facebook page to be updated as soon as there is a new blog post.
If you have any questions about shaders or Unity development in general, don't be shy: leave a message on my Facebook page or down in the comments.
