Toon And Cell Shader
- jamesghholt
- Mar 30, 2022
- 5 min read
Updated: Jun 21, 2022
Toon shading is usually defined as light limited to fewer gradients, tints and shades to create a flat "cartoon" look. Cell shading refers to the outlines, and sometimes inner contour lines, used to mimic a drawn comic style. These two shaders go hand in hand.

We initially wanted to create this effect to further the stylisation of our scene and complement the texturing. Our initial inspiration is this scene created by Althea Krebelj:

This style has a very slight outline, nothing too harsh, which is something we wanted to mimic. To take it a step further, I worked a complementary toon shader in alongside it.
Games have usually been lit through "vertex lighting"; for older games this is done through "Gouraud Shading", as in the example below. More modern games may default to variations of Phong shading instead, which is per-fragment/per-pixel, but let's stick to per-vertex for now. Gouraud shading interpolates colours across polygons, calculated per vertex normal.

The intensity of lit surfaces is calculated in a similar fashion to "Geometrical Optics", which describes light propagation in terms of rays and is the model usually used for depicting reflection and refraction. If a vertex normal is angled closer towards the light source, it receives a greater lighting value within a range of 0-255 (0 being angled furthest away from the light source [dark], and 255 being facing the light source directly [lit]).
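As a rough sketch of that calculation (illustrative HLSL, not engine code), the intensity boils down to the dot product between the vertex normal and the light direction:

```hlsl
// Sketch: per-vertex lighting intensity as a simple Lambert term.
// Inputs are assumed to be normalised and in the same space.
float ComputeLightIntensity(float3 vertexNormal, float3 lightDirection)
{
    // 1 when the normal faces the light directly, 0 at 90 degrees or beyond.
    float nDotL = saturate(dot(vertexNormal, lightDirection));
    return nDotL; // 0-1 here; think of it as 0-255 in 8-bit terms.
}
```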

This 0-255 value indexes into a LUT (Look Up Texture), 256 pixels wide, usually a gradient from black to white in the case of vertex lighting. If a vertex is determined to have a value of 255 (vertex normal angled directly towards the light source), the colour on the furthest right is sourced: white. This gives the appearance of diffuse light interacting with the asset. The opposite applies for a vertex with a value of 0.
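A minimal sketch of that lookup, assuming a simple horizontal ramp texture and the 0-1 intensity from before:

```hlsl
// Sketch: map the lighting intensity onto a 1D ramp texture (LUT).
Texture2D RampLUT;          // e.g. a 256x1 black-to-white gradient
SamplerState RampSampler;   // clamped sampler, so 0 and 1 don't wrap

float3 SampleLightingLUT(float nDotL)
{
    // Intensity 0 reads the left edge (dark), intensity 1 the right edge (lit).
    float2 uv = float2(nDotL, 0.5);
    return RampLUT.Sample(RampSampler, uv).rgb;
}
```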

Toon shading essentially manipulates this LUT. As seen above, "ZAtoon" is just two colours, black and white, with a slight gradient between them to ease the switch between the two values. This limits vertices to sourcing one tone or the other, creating the harshly lit toon effect.
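In code terms, what a two-tone ramp like this effectively encodes is a hard cut with a tiny blend region; something like the sketch below (the threshold and blend width are made-up values, not taken from the actual texture):

```hlsl
// Sketch of what a two-tone toon ramp effectively encodes: everything below
// the threshold snaps to dark, everything above to lit, with a narrow
// smoothstep so the transition isn't a single aliased pixel.
float ToonRamp(float nDotL)
{
    const float threshold = 0.5;  // illustrative value
    const float blend     = 0.02; // width of the soft edge between the tones
    return smoothstep(threshold - blend, threshold + blend, nDotL);
}
```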

This LUT can be manipulated even further. Colours other than black and white can be introduced, which can dramatically change the style and mood of your game.

In the case of Wind Waker, various LUTs are available for different times of day and changes in weather. This effect looks incredible; unfortunately it's not exactly applicable to our game, as the weather wouldn't change too dramatically within a match lasting a minute or so. However, we could've worked in randomised weather/time of day per match.

Beginning the development of a toon shader in Unreal Engine, I worked in a rudimentary shader, which had plenty of room for improvement. It samples the light information; if the lighting intensity is below or above a specific value, it is replaced with the suiting "tint", light or dark. These tints can be controlled, as they are 3Vector parameters (RGB). Essentially this shader mimics the previous "ZAtoon", but allows for dynamic parameters in engine, such as bias control.
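In rough HLSL terms, as a sketch of the logic rather than the actual material graph, the core of the shader is just a comparison against a bias and two tint parameters:

```hlsl
// Sketch of the rudimentary toon shader's core logic: compare the light
// intensity against a bias and pick one of two controllable tints.
float3 TwoToneToon(float lightIntensity, float3 darkTint, float3 litTint, float bias)
{
    // lightIntensity: 0-1 diffuse lighting sampled from the scene
    // bias: the adjustable cut-off exposed as a material parameter
    return (lightIntensity < bias) ? darkTint : litTint;
}
```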
A distance mask is in place for most of our shaders, to prevent the shader interacting with the skybox and causing obvious artefacts. A "Custom Depth Stencil" is used to mark which objects should be affected by the shader, in case any unfortunate shading issues occur.
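The masking step, roughly sketched (the cut-off distance and stencil value here are placeholders, not our project's actual settings):

```hlsl
// Sketch of the masking: ignore anything further than a cut-off distance
// (the skybox) and anything marked with a chosen Custom Depth Stencil value.
float3 ApplyToonMasked(float3 toonColour, float3 originalColour,
                       float sceneDepth, float stencilValue)
{
    const float skyboxCutoff = 100000.0; // placeholder distance in Unreal units
    const float excludedMask = 1.0;      // placeholder stencil value

    bool isSkybox   = sceneDepth > skyboxCutoff;
    bool isExcluded = abs(stencilValue - excludedMask) < 0.5;

    // Fall back to the untouched scene colour where the shader shouldn't apply.
    return (isSkybox || isExcluded) ? originalColour : toonColour;
}
```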

As you might assume, this shader only allows for two extremes, as seen below: there is no gradient between steps/cuts, and a lack of said steps/cuts altogether.

If it's unclear, this is what I mean by steps/ cuts:

To combat these limitations, I worked towards a toon shader influenced by an external LUT. This removes the benefit of dynamic parameters, but gives the customisability we desired: multiple steps, colour and greater bias control.
Another alternative would be material-based toon shading; however, this is extremely taxing on performance and every material would need to be configured. For shading an entire game this is an extremely ineffective method, but for small, unique assets it can play a great part -- for example, the Powerup Pickup Shader, which you can find elsewhere on my blog.
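A rough sketch of the LUT-driven version (illustrative names, not our exact graph): the lighting intensity becomes the U coordinate into the external LUT, so whatever steps and colours the texture contains are what the scene receives.

```hlsl
// Sketch of the LUT-driven toon shading step.
Texture2D ToonLUT;           // e.g. a horizontal ramp defining the steps/tints
SamplerState ToonLUTSampler; // clamp addressing so the ends of the ramp don't wrap

float3 ToonFromLUT(float lightIntensity, float3 albedo)
{
    // The LUT defines how each lighting value is remapped: its steps and
    // colours become the steps and tints seen in the scene.
    float3 ramp = ToonLUT.Sample(ToonLUTSampler, float2(lightIntensity, 0.5)).rgb;
    return albedo * ramp;
}
```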

Here are some example LUTs we have used:

Below it "T_Blue_LUT", as you can see it creates 3 steps, and tints the scene a more moody blue. LUTs allow us to completely reimagine our scene, through one texture input.

Here are some examples of various LUTs' effects on our entire scene:



Using LUTs did come with some complications, mainly a conflict with emissive materials. The LUT would only comfortably display the Albedo alone. However, a small switch to additive materials fixed this. Below, from left to right, is the fire with emissive, the fire with the LUT shader applied, and the fire's Albedo with the LUT applied.

Cell shading can be achieved in many ways; depending on the needs of the asset, or the needs of the game as a whole, you may sway towards different methods. A frequent method for those not versed in shaders, materials or post processing is the inverse hull method, which takes advantage of back face culling: a process which hides polygons orientated away from the viewer, either through winding order or by comparing face normals against the view direction. By creating a shell around your chosen asset and inverting its normals, a cell shaded effect can be quickly achieved.
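As a rough sketch of what that duplicated hull does in a vertex shader (the names and outline width are illustrative):

```hlsl
cbuffer OutlineParams
{
    float4x4 WorldViewProj;
    float    OutlineWidth; // illustrative: how far the shell is pushed out
};

struct VSInput
{
    float3 position : POSITION;
    float3 normal   : NORMAL;
};

// Sketch of the inverted hull pass: the duplicated shell is inflated along its
// normals and rendered with inverted winding / front-face culling, so only its
// backfaces show, reading as an outline around the original mesh.
float4 InvertedHullVS(VSInput input) : SV_Position
{
    float3 inflated = input.position + normalize(input.normal) * OutlineWidth;
    return mul(float4(inflated, 1.0), WorldViewProj);
}
```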

This is usually used for quick static renders or extremely low poly characters, as the mesh in its entirety must be duplicated to create an accurate silhouette, essentially doubling your tri count. Plus, for animated meshes such as a character, the outer shell can frequently clip into the character. Furthermore, as this is created within the user's 3D package, no further tweaks can be made in engine once implemented. This low tech method doesn't suit games at all, only quick static renders.
On the other end of the stick is the most complicated method: custom shaders (USF [Unreal Shader Files]), writing HLSL (High Level Shader Language [DirectX]) externally and importing it into Unreal. This method is extremely optimised, to the point where mobile and VR limitations can be overcome. However, it is extremely high tech and currently outside of my capabilities.
The method I used is the happy medium of the two: Unreal Engine's post processing. All done within Unreal using node graphs, it's extremely flexible and easy to tweak in engine. It may not be as performance efficient as USF, but it's much better than using the inverse hull method. Plus, performance is predictable and measurable. There are mobile and VR limitations, but fortunately we have no plans to port our game to such platforms.
To break this shader down to its bare fundamentals, Sobel-style operations are used to compare each pixel to its offset surrounding pixels. If a difference in value is identified, lines are created. The depth of assets within the scene is found through the "SceneDepth" node; here's an example of how the node functions:

As you can see, depending on an asset's depth within the scene relative to the camera, it is assigned a different value.
Below is the full node graph. The five rows inside the annotated "Depth Lines" comment sample the left, right, below and above pixels in relation to the centre pixel from the scene depth.
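In HLSL terms, the logic those rows implement looks roughly like the sketch below; this is the idea rather than the graph itself, and the bias is a placeholder:

```hlsl
// Sketch of the depth-based line detection: sample the scene depth at the
// centre pixel and its four neighbours, and flag an edge wherever the
// neighbours differ from the centre by more than a bias.
Texture2D SceneDepthTex;
SamplerState DepthSampler;

float DepthEdge(float2 uv, float2 texelSize, float lineThickness, float depthBias)
{
    float2 offset = texelSize * lineThickness;

    float centre = SceneDepthTex.Sample(DepthSampler, uv).r;
    float left   = SceneDepthTex.Sample(DepthSampler, uv + float2(-offset.x, 0)).r;
    float right  = SceneDepthTex.Sample(DepthSampler, uv + float2( offset.x, 0)).r;
    float up     = SceneDepthTex.Sample(DepthSampler, uv + float2(0, -offset.y)).r;
    float down   = SceneDepthTex.Sample(DepthSampler, uv + float2(0,  offset.y)).r;

    // Sum of absolute differences against the centre; big jumps mean a silhouette.
    float delta = abs(left - centre) + abs(right - centre)
                + abs(up - centre)   + abs(down - centre);

    return delta > depthBias ? 1.0 : 0.0; // 1 = draw a line here
}
```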

This sampling alone creates this scene:

This alpha is used as the cell shading, and can be tinted through a 3Vector parameter. The scene itself is reintroduced, so the player doesn't just see the scene above. Other parameters are available, such as line bias and thickness. Unfortunately, the thickness is dependent on the amount of pixels available, so the further an asset sits from the viewer, the thicker its outline will be, due to the lack of pixels available.
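Put together, the compositing step is roughly the following (parameter names are illustrative):

```hlsl
// Sketch of the final composite: where the edge mask is set, blend the scene
// towards the line tint; everywhere else, the original scene shows through.
float3 CompositeCellShading(float3 sceneColour, float edgeMask, float3 lineTint)
{
    return lerp(sceneColour, lineTint, saturate(edgeMask));
}
```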
Inner lines, influenced by normals, can also be defined, using a similar method to the scene depth but with world normals instead. However, our game uses only Albedo, so this has no purpose within our game.

I am very happy with the final product, and learned a lot throughout this process.