Using Perlin Noise To Generate Terrain In Unity

It’s fairly common to find Perlin Noise being used to generate procedural terrain, among other things. Perlin Noise can be used to generate a heightmap, which in turn can be used to control the height of a chunk of terrain… and it would then be common to layer those heightmaps at differing frequencies to create the sort of surface detail that you would associate with real world terrain.

A single Perlin Noise heightmap might look like this.
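For anyone who hasn't used it, producing a heightmap like that takes only a few lines of C#. Here's an illustrative sketch (the resolution and input scale are arbitrary values picked for the example, not the settings used below):

```csharp
// Fill a square grid with Perlin Noise heights in the 0..1 range.
int resolution = 256;
float scale = 0.05f;
float[,] heights = new float[resolution, resolution];
for (int z = 0; z < resolution; z++)
    for (int x = 0; x < resolution; x++)
        heights[x, z] = Mathf.PerlinNoise(x * scale, z * scale);
```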

I’m going to describe a quick test here where I’ll build a grid of terrain cells in Unity, using some custom shaders to make the rendering a bit more interesting. What I’ll end up with is a simple interface, defined via a C# script, that looks something like this in the Inspector and allows me to quickly build and edit the properties of my terrain.

The way I’ve set this up is such that clicking the Build button cleans out all children of the game-object that the script is attached to, and then generates a set of new children representing terrain tiles and the water. The surface area of the terrain is defined by the tile count and tile size.
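The clean-out step is the usual Unity pattern of destroying children in reverse order; something along these lines (a sketch, using DestroyImmediate rather than Destroy since this runs from the editor, not during play):

```csharp
// Remove old terrain tiles and water before rebuilding.
for (int i = transform.childCount - 1; i >= 0; i--)
    DestroyImmediate(transform.GetChild(i).gameObject);
```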

For each tile I’ll generate six sets of Perlin Noise, using different frequencies (input scales) and weights, which I’ll then sum and normalize, before raising to a power and finally scaling to fit the desired world height. We raise to a power in order to flatten out the lower terrain while leaving the taller features more or less untouched… which I find yields more interesting results. Once we have our heights we create polygon meshes per tile to represent the terrain.

The important part of all this is that deep inside the inner loop of the script we do the following per-vertex calculation, where the s and w values are taken directly from the UI we’ve created.

       // sum the weighted heights.
       float y = 0.0f;
       y += Mathf.PerlinNoise(x / s0, z / s0) * w0;
       y += Mathf.PerlinNoise(x / s1, z / s1) * w1;
       y += Mathf.PerlinNoise(x / s2, z / s2) * w2;
       y += Mathf.PerlinNoise(x / s3, z / s3) * w3;
       y += Mathf.PerlinNoise(x / s4, z / s4) * w4;
       y += Mathf.PerlinNoise(x / s5, z / s5) * w5;
       y /= weightSum;

       // raising the height to a power gives a much nicer result
       // - more flat areas... recognizable mountains... etc
       // also factor in a height scale.
       y = Mathf.Pow(y, WorldHeightPower);
       y *= WorldHeightScale;
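For context, once the heights are computed each tile's mesh gets assembled from them. This is a simplified sketch rather than the exact script: vertsPerSide, heights, meshFilter and meshCollider are hypothetical names standing in for the tile's grid resolution, height array and components.

```csharp
// Build a grid mesh from the computed heights (sketch).
var vertices = new Vector3[vertsPerSide * vertsPerSide];
var triangles = new int[(vertsPerSide - 1) * (vertsPerSide - 1) * 6];
for (int z = 0; z < vertsPerSide; z++)
    for (int x = 0; x < vertsPerSide; x++)
        vertices[z * vertsPerSide + x] = new Vector3(x, heights[x, z], z);

int t = 0;
for (int z = 0; z < vertsPerSide - 1; z++)
{
    for (int x = 0; x < vertsPerSide - 1; x++)
    {
        int i = z * vertsPerSide + x;
        // Two triangles per grid cell, wound clockwise when seen from above.
        triangles[t++] = i; triangles[t++] = i + vertsPerSide; triangles[t++] = i + 1;
        triangles[t++] = i + 1; triangles[t++] = i + vertsPerSide; triangles[t++] = i + vertsPerSide + 1;
    }
}

var mesh = new Mesh { vertices = vertices, triangles = triangles };
mesh.RecalculateNormals();
meshFilter.sharedMesh = mesh;
meshCollider.sharedMesh = mesh;
```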

With those meshes built and attached to a MeshFilter component (and a MeshCollider too) I’ll then link them to a custom shader. Assuming I’ve a shader in my asset library this is easy. I just need to add a MeshRenderer component and link it to my shader like so. Since I’ve based my shader on the standard Unity surface shader I also need to enable the metallic workflow and set the glossiness, to prevent the terrain looking like shiny plastic.

      meshRenderer.sharedMaterial = new Material(Shader.Find("Custom/GOAT_Terrain"));
      meshRenderer.sharedMaterial.EnableKeyword("_SPECGLOSSMAP");
      meshRenderer.sharedMaterial.SetFloat("_Glossiness", 0.1f);

My custom shader samples two textures, one for grass and one for cliffs, and I also provide a scale for each. Whenever these are changed via the Inspector I can easily connect the new values to my components by doing the following for each terrain tile, bearing in mind that I’ve named them _GrassTex and _CliffTex in my shader code.

      meshRenderer.sharedMaterial.SetTexture("_GrassTex", GrassTexture);
      meshRenderer.sharedMaterial.SetTexture("_CliffTex", CliffTexture);
      meshRenderer.sharedMaterial.SetTextureScale("_GrassTex", new Vector2(GrassScale, GrassScale));
      meshRenderer.sharedMaterial.SetTextureScale("_CliffTex", new Vector2(CliffScale, CliffScale)); 

Finally… my shader uses the following code in place of the usual code that samples _MainTex, allowing for the two textures to be sampled and blended based on the surface normal, meaning you automatically get the grass texture on the flatter areas and the cliff texture otherwise…

      float lerpFactor = saturate(saturate(o.Normal.y - 0.8f) * 8.5f);
      fixed4 grassColor = tex2D(_GrassTex, IN.uv_GrassTex);
      fixed4 cliffColor = tex2D(_CliffTex, IN.uv_CliffTex);
      fixed4 textureColor = grassColor * lerpFactor + cliffColor * (1.0f - lerpFactor);
      o.Albedo = textureColor.rgb * IN.color.rgb;

The end result looks something like this…

The result thus far isn’t really what I was after. I was hoping to create something a little more stylized, with more interesting and not necessarily natural features, and I’m not sure this is taking me in the right direction. So, while I could carry on and try to make this look super realistic, I’m going to leave this experiment there and look into other approaches.

Peter Panning Shadows In Unity

I’d like to find the cause of some problems I’ve seen with shadow casting lights in Unity and in order to do so I’d like to be able to edit and generally experiment with changes to the standard deferred shaders. It turns out you can do just that!

For reference this is the problem I want to address:

The first step is to download the built-in shaders from here.

Extract Internal-DeferredShading.shader and rename it. I’ve called it GOAT-DeferredShading.shader… just because I like goats… but you can call it anything. From there, import the new shader into the project under Resources, and in the project’s Graphics settings change Deferred to Custom Shader and point it at the imported shader.

Now we can try a modification to double check it’s all hooked up. Any modification to CalculateLight should be immediately visible in the game view. Try faking the GBuffer inputs or something similar and you should see the result right away if the new shader is working.
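For example, something as crude as this (just a throwaway change, to be reverted) makes it obvious whether the custom shader is the one being picked up: if it is, the whole lit scene turns solid red.

```hlsl
half4 CalculateLight (unity_v2f_deferred i)
{
    // If the custom deferred shader is active the scene renders solid red.
    return half4(1, 0, 0, 1);
}
```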

I’m trying to find the cause of some shadow caster Peter Panning problems so in order to ensure the relevant code is available I also import UnityDeferredLibrary and UnityShadowLibrary, renaming them and then replacing the includes to chain the files together such that the custom versions of the code are used instead of the original versions.

Almost right away I spot this code in UnityShadowLibrary.cginc:

inline half UnitySampleShadowmap (float3 vec)
{
    float mydist = length(vec) * _LightPositionRange.w;
    mydist *= 0.97; // bias

Hard coded numbers always make me suspicious so I try eliminating it, and this is the result.

This image shows shadow acne. It occurs where the depth of the occluder is more or less the same as the depth of the receiver, and the precision of the depth buffer isn’t enough to resolve the difference, so pixels end up incorrectly marked as in shadow. In effect the surface here is casting a shadow onto itself. The standard way of dealing with this would be to default to casting shadows from back faces only… where, from the point of view of a light, the back faces are never visible, are therefore always in shadow, and are almost always deep enough not to interfere with the front-facing polygons’ shadow calculations. Is this not how shadows work in Unity?

Changing this hard-coded bias does however appear to fix the Peter Panning problems… it just causes lots of other problems at the same time.

If I’m going to fix this properly I need to be able to control the shaders that write to the shadow map. It turns out you can do that too. Unity comes with a shader called Standard.shader that is used by the standard materials. As before we can extract this, rename to GOAT-Standard.shader and import it. Then we apply this small modification to switch from front face to back face shadow casters…

     // ------------------------------------------------------------------
     //  Shadow rendering pass
     Pass {
        Name "ShadowCaster"
        Tags { "LightMode" = "ShadowCaster" }
        ZWrite On ZTest LEqual		
        Cull Front  // <---------------------------

At the same time as making that change, we can comment out the line that multiplies the distance by 0.97… and as if by magic everything works more or less the way I wanted it to. With the bias removed a small amount of flickering is visible where the pillar joins the floor, but with this setup a tiny bias in the opposite direction is actually more appropriate, so I end up with code that looks like this instead:

inline half UnitySampleShadowmap (float3 vec)
{
    float mydist = length(vec) * _LightPositionRange.w;
    mydist *= 1.01; // bias

And now the shadows are totally solid, and match the geometry perfectly!

First Steps in Unity

I’ve installed Unity (2017.1) for the first time… and want to get to grips with some of the basics, so I’m going to build a small scene with a few moving parts covering some of the things I expect to be important to me.

When you open Unity you start with a default setup that provides a few things out of the box: namely a camera, a directional light, and some default render settings.

Out of the box, what you don’t get is access to a deferred rendering pipeline. I want one of these as I think deferred lighting, bloom, decent reflections, etc, are all essential components, so the first thing I need to do is install the Post Processing Stack, described here. Once you’ve got that installed you create a post-processing profile, which describes the precise pipeline you want to use, and add it to the camera via a Post Processing Behaviour component.

Next, Unity adds a lot of ambient light and reflections by default. I don’t want any of these as I want tight control over the lighting. Basically, if I don’t add any lights I want everything to be black, not some default ambient color that it’s chosen for me. I understand why they do this… an out-of-the-box experience where everything is black until you add some lights might be confusing for some people… but I want to get rid of this light source. To do that you need to go to the lighting settings and set Environment Lighting and Environment Reflections to both be sourced from a color… and then make that color black.
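The same can also be done from script via RenderSettings, if you prefer. A minimal sketch (the class and method name here are mine, not anything Unity provides):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class LightingSetup
{
    // Kill the default ambient light and reflections from code.
    public static void Blackout()
    {
        RenderSettings.ambientMode = AmbientMode.Flat;
        RenderSettings.ambientLight = Color.black;
        RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        RenderSettings.customReflection = null; // no reflection cubemap at all
    }
}
```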

Having done all that, we still need to activate the deferred pipeline. To do that you go to the project graphics settings, under Edit -> Project Settings -> Graphics, and switch all tiers to Deferred.

Finally, I’ve turned off the sky too, which you can do by setting the clear color of the camera to black. I’ll add a sky again as and when I think I need one. For now it looks odd that I have so little light in the scene and then this bright sky sitting on top of everything, so I’m better off without it.

Unless I’ve missed a step there, you should now be able to drop in a shadow casting point light and a few cubes and get something similar to this…

Next up, I want to add a few textures. I have some normal and albedo maps lying around from previous experiments so I grab a few of those and drop them in as assets. For the normal map you need to open it up and set its type to Normal Map. Then you can create a material and connect them as Albedo and Normal Map under Main Maps. Finally, select the floor and set its material to the new material we just created. In my case I’ve also had to set the tiling on the material to 15 to make the maps fit the large floor I’m using.

Now I want to animate a few things, which seems like an excuse to try adding a C# script to my scene. I start by making a new script in the asset library, then I open it, which should open VS2017 Community Edition. From there you are presented with methods called Start and Update to which you can add code. I type this.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class NewBehaviourScript : MonoBehaviour {

    public Vector3 initialPosition;
    public float time;

    void Start () {
        // Remember the starting position so the motion is relative to it.
        initialPosition = transform.position;
        time = 0.0f;
    }

    void Update () {
        time += Time.deltaTime;
        // Reset to the start, then offset: bob in Y, slide side to side in X.
        transform.position = initialPosition;
        transform.Translate(Vector3.up * Mathf.Cos(time * 5.0f));
        transform.Translate(Vector3.left * Mathf.Sin(time) * 2.0f);
    }

}

Once saved, I drag the script into the space below the existing components on each light, which automatically creates a new script component for each… and when you click Play the lights slide back and forth along the X axis and bob up and down in Y.

There are a few things here I’m still not happy with. The first is that the shadows cast from the column exhibit what’s known as Peter Panning, where there is a disconnect between the caster and the shadow itself, making it look as though the object is not quite sat on the floor. I’ve checked, and the object actually intersects the floor, so there is no gap at all. Unity exposes a per-light shadow bias setting that seems to control this, but it appears that a bias of 0 doesn’t really mean 0 and so we still end up with a gap. I believe we should be able to get away with a much smaller bias and shouldn’t need to put up with this artifact, but I can’t see an obvious way to achieve that…
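For reference, that per-light setting can also be driven from script, though as noted even zero doesn't close the gap entirely. A sketch:

```csharp
// Per-light shadow bias, settable from script as well as via the Inspector.
Light shadowLight = GetComponent<Light>();
shadowLight.shadowBias = 0.0f;        // depth bias; 0 still leaves a visible gap
shadowLight.shadowNormalBias = 0.0f;  // normal-offset bias
```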

Other things I’ve struggled with a little are the point lights, where I seem to have to make them very intense to get enough light in the scene, but doing so seems to create very intense lighting some distance from the object, which doesn’t feel realistic. I need to do some experiments as I’m not sure if this is down to the bloom settings, the tone-mapper, eye adaptation, or more likely the light falloff, or some other thing… but when I’ve previously coded everything myself it seemed much easier to balance the lighting and get a more natural look to things.

I’ve a feeling that my next set of experiments might revolve around writing custom shaders for Unity.

Downloading Historical Stock Data With Pandas

This is a short post quickly outlining the Python module Pandas, which has been a great find. Pandas makes the manipulation of labelled multi-dimensional data very easy indeed, allowing operations that might be more familiar to users of a typical spreadsheet app to be coded up in Python in no time.

Installation of Pandas and the associated Pandas DataReader modules is as simple as this:

python -m pip install pandas
python -m pip install pandas-datareader

I’m not going to cover details of the core Pandas feature set, but rather wanted to quickly show how the DataReader module makes the automatic collection of historical stock data very easy indeed… so, for example, this code snippet shows how to download closing prices for Microsoft stock over a roughly three year time-frame and save them to a CSV file.

import pandas_datareader.data as pdr_data
import datetime

# Download daily prices for MSFT from Google Finance and save them to CSV.
start = datetime.datetime(2010, 1, 1)
end = datetime.datetime(2013, 1, 27)
data = pdr_data.DataReader('MSFT', 'google', start, end)
data.to_csv("MSFT.csv")
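Once downloaded, the same DataFrame supports the spreadsheet-style operations I mentioned directly. Here's a small sketch of the kind of thing, using synthetic data (so it runs without a network connection) to compute a rolling mean of closing prices:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the downloaded data: a Close column indexed by date.
dates = pd.date_range("2010-01-01", periods=10, freq="D")
data = pd.DataFrame({"Close": np.arange(10, dtype=float)}, index=dates)

# A 5-day rolling mean, the sort of smoothing often applied to stock prices.
data["Close5"] = data["Close"].rolling(window=5).mean()

print(data["Close5"].iloc[-1])  # mean of 5.0..9.0 -> 7.0
```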

TensorFlow Setup For Windows 10

This is more a summary of my own experience getting TensorFlow up and running on my Windows 10 PC than a complete setup guide, but it might help people (including future me, if I do this again on a fresh PC) figure out solutions to a few of the problems I encountered along the way.

Before starting I’d advise reviewing the documentation for installing TensorFlow here, and possibly the beginner’s guide here too.

Step 1 is to install the NVIDIA CUDA Toolkit and NVIDIA cuDNN. I already had CUDA v8.0 installed, which is exactly what we need, so we are good so far, but I didn’t have the cuDNN lib installed. My first reaction was to download and try to set up v6.0, but I later found that v5.1 was needed, so you need to download that version specifically. The official setup guide requests that you put the cuDNN lib somewhere and then add that location to the PATH environment variable. I’ll come back to that later.

I’m planning to use TensorFlow from Python. I had Python 3.5.1 installed. For some reason the available tensorflow module wasn’t happy about that and I had to downgrade to Python 3.5.0 before it would install. Old versions of Python are here. To complete the downgrade I also had to change the PATH environment variable to ensure the new (older) version was used in preference to the newer one.

The Python version can easily be checked like this:

python --version
Python 3.5.0

I also had to upgrade pip before TensorFlow would install, which can be done like this:

python -m pip install --upgrade pip

Finally I was able to install the package, choosing one of the following for the CPU or GPU implementation respectively. I chose GPU.

python -m pip install --upgrade tensorflow
python -m pip install --upgrade tensorflow-gpu

Having done all that, it still wouldn’t work for me, now complaining about missing DLLs!

Quite a few forums suggest that the errors meant I was missing “Visual C++ 2015 Redistributable Update 3”. I was missing that, as it happens, so I spent some time trying to install it. Eventually I realized that the Visual C++ 2017 version is a binary-compatible in-place upgrade to the Visual C++ 2015 version, and so what I had installed should already meet the requirements of TensorFlow. This wasn’t the problem.

Fixing this problem actually involved ignoring the CUDA installation instructions: instead of using PATH to locate the cuDNN lib, I found it only worked when I merged the files from cuDNN into the CUDA installation itself.
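For the record, the merge amounts to copying the cuDNN files into the matching CUDA Toolkit directories. The paths below assume a default CUDA v8.0 install and a cuDNN v5.1 download extracted to a cudnn folder; adjust to suit your machine:

```
copy cudnn\bin\cudnn64_5.dll "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v8.0\bin"
copy cudnn\include\cudnn.h   "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v8.0\include"
copy cudnn\lib\x64\cudnn.lib "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v8.0\lib\x64"
```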

Having done that I can run the sample mnist_softmax.py from the beginners guide without error!

I’m new to this, but the first thing I did was to modify the sample so the network looked a bit more like the networks I’ve built by hand (in C++) to read the MNIST samples: basically adding a hidden layer with 100 neurons and randomizing the weights, so the code near the top ends up looking like this.

x = tf.placeholder(tf.float32, [None, 784])
W1 = tf.Variable(tf.random_normal([784, 100], stddev=0.01))
W2 = tf.Variable(tf.random_normal([100, 10], stddev=0.01))
b1 = tf.Variable(tf.zeros([100]))
b2 = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W1) + b1
y = tf.nn.relu(y)
y = tf.matmul(y, W2) + b2
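For reference, here's what that two-layer network computes, re-expressed as a plain NumPy sketch of the forward pass only (not the TensorFlow graph itself, and with no training):

```python
import numpy as np

rng = np.random.default_rng(0)

# Shapes mirror the TensorFlow variables above: 784 -> 100 -> 10.
W1 = rng.normal(0.0, 0.01, size=(784, 100))
W2 = rng.normal(0.0, 0.01, size=(100, 10))
b1 = np.zeros(100)
b2 = np.zeros(10)

def forward(x):
    # Hidden layer with ReLU, then the linear output layer (logits).
    h = np.maximum(x @ W1 + b1, 0.0)
    return h @ W2 + b2

x = rng.normal(size=(1, 784))  # one fake 28x28 image, flattened
logits = forward(x)
print(logits.shape)            # (1, 10)
```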

Alongside that I also increased the number of batches to be processed to 20k, which gets me a score of 98.0%!

UPDATE: Trying to run some of the other samples this morning I’m getting crashes in the Python process, coupled with error messages relating to cupti64_80.dll. I eventually figured out that this DLL comes with the CUDA Toolkit and that I already have it installed under CUDA\v8.0\extras\CUPTI\libx64, though that location is not in the PATH. Adding it to the PATH resolves the problem.