
Various VFX

  • jamesghholt
  • Mar 30, 2022
  • 6 min read

Updated: Jun 13, 2022

Types of VFX

  • Particle Systems - Use only particles; no meshes or shaders.

  • Mesh Effects - Use 3D models, which can include bones and animation. UVs are extremely important in their application.

  • Flipbook/Sprite Sheets - Multiple frames that, together, create an animation, built from simulated physics or hand-painted work. Useful for fire, liquid and smoke VFX; requires knowledge of external software.

  • Shaders - Advanced topic, can be applied to a multitude of VFX.

  • Hybrids - VFX which combine two or more of the previously listed types.

Principles of VFX

  • Gameplay - The purpose the VFX serves in its gameplay setting.

  • Timing - The three phases of an effect:

  • Anticipation: Creates an expectation so the player can react.

  • Climax: The peak of the effect, where its impact is delivered.

  • Dissipation: The effect fades out, returning the player's attention to gameplay.

  • Shape - The player needs to understand the shape (use easily recognisable shapes) and size of the VFX's area of effect. Furthermore, take shape language into account, in relation to the VFX's effect on the player.

  • Contrast - Creates a focal point for the player's attention (higher contrast = higher focus; lower contrast = lower focus), e.g. the tip of a damaging projectile. Contrast can also be used to show the intensity of the VFX's effect within the game, such as the difference between basic attacks and an ultimate ability.


  • Colour/Theme - A colour palette alone doesn't always convey the VFX, since one palette could be applied to a great multitude of contrasting VFX, e.g. green for poison or healing. Applying relevant elements and the previous principles alongside the VFX is what truly conveys its purpose to the player. However, this isn't to say the colour palette should be disregarded; just be careful with how it's applied.


Pre-Production

  • Gameplay Objective

  • Getting References

  • Technical Research

  • Sketching


VFX Systems: Niagara VS Cascade

Cascade is older, has hard limitations on what it can do, and runs mostly on the CPU. Niagara, by contrast, has effectively no such limitations and can run simulations directly on the GPU. You get access to every parameter of a single particle and can extend the system if needed.


In the Niagara VFX system, there are four core components:

  • Systems

  • Emitters

  • Modules

  • Parameters

Systems: Containers for multiple emitters, all combined into one effect. For a firework effect, for example, multiple emitters can be used to create multiple bursts. Using the "Timeline" panel, emitters contained within the system can be quickly managed.


Emitters: Containers for modules. Single purpose, but reusable. One unique thing about Niagara emitters is that you can create a simulation using the module stack, and then render that simulation multiple ways in the same emitter.


Modules: Niagara modules are the base level of Niagara VFX, the equivalent of Cascade's behaviours. Modules speak to common data, encapsulate behaviours, and stack with other modules to build up functionality. Modules are built using High-Level Shading Language (HLSL), but can also be built visually in a Graph using nodes.


Parameters: An abstraction of data in a Niagara simulation. A parameter type is assigned to each parameter to define the data that parameter represents.


The Emitter node is broken down into six more components, allowing in-depth customisation:

  • Emitter Spawn

  • Emitter Update

  • Particle Spawn

  • Particle Update

  • Add Event Handler

  • Render

Emitter Spawn: Updates only once, at the beginning of the Niagara emitter's lifetime. Emitter Update: Updates the emitter every frame, and is responsible for the spawning of particles.
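As a rough illustration of the Emitter Update stage's spawning role, here is a minimal sketch in plain Python (not the Niagara API): a continuous spawn rate is converted into a whole number of particles each frame, with the fractional remainder carried over so no particles are lost to rounding. The carry-over approach is an assumption about one common way rate-based spawning is handled.

```python
# Conceptual sketch (not Niagara code): turn a spawn rate into whole
# particles per frame, carrying the fractional remainder forward.

def particles_to_spawn(rate_per_second, dt, carry):
    """Return (count, new_carry) for one frame of length dt."""
    amount = rate_per_second * dt + carry
    count = int(amount)            # whole particles spawned this frame
    return count, amount - count   # remainder accumulates for later frames

# Example: 100 particles/sec at a 0.016 s frame time (~1.6 per frame).
carry = 0.0
spawned = []
for _ in range(4):
    n, carry = particles_to_spawn(100.0, 0.016, carry)
    spawned.append(n)
# spawned alternates 1, 2, 1, 2 as the remainder builds up and is spent.
```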


Particle Spawn: Where the majority of modules reside, and where most of our attention will be. Occurs once per created particle and controls the spawning of said particle. Particle Update, by contrast, occurs every frame, per particle. It is used for gravity, colour/alpha scaling, forces and age, for example.
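A minimal sketch (plain Python, not Niagara code) of what a Particle Update pass does conceptually: every frame, each particle accumulates gravity, ages, and is culled once it exceeds its lifetime. The particle layout here is a made-up stand-in, not real engine data.

```python
# Conceptual per-frame "Particle Update": gravity, ageing, lifetime cull.

def update_particles(particles, dt, gravity=-980.0):
    """Each particle is a dict with velocity_z, position_z, age, lifetime."""
    alive = []
    for p in particles:
        p["velocity_z"] += gravity * dt        # accumulate gravity
        p["position_z"] += p["velocity_z"] * dt
        p["age"] += dt                         # age drives colour/scale curves
        if p["age"] < p["lifetime"]:           # cull expired particles
            alive.append(p)
    return alive

particles = [
    {"velocity_z": 0.0, "position_z": 0.0, "age": 0.0, "lifetime": 1.0},
    {"velocity_z": 0.0, "position_z": 0.0, "age": 0.95, "lifetime": 1.0},
]
particles = update_particles(particles, dt=0.1)  # second particle expires
```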


In many cases, you will want several emitters in one system to interact with each other to create the effect you want. This usually means that one emitter generates some data, and then other emitters listen for that data, and perform some behaviour in reaction to that data. In Niagara, this is done using Events and Event Handlers. Events are the modules that generate specific events that occur in the lifetime of a particle. Event Handlers are modules that listen for those generated events, and then initiate a behaviour in response to that event.
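The generate-and-listen pattern described above can be sketched in plain Python (the event shape and emitter names here are hypothetical, not Niagara's actual classes): one emitter records an event when a particle dies, and a second emitter's handler reacts by spawning a burst at that location.

```python
# Conceptual sketch of Events and Event Handlers between two emitters.

events = []

def rocket_emitter_update(particle):
    """Generating emitter: record a 'death' event when a particle expires."""
    if particle["age"] >= particle["lifetime"]:
        events.append({"type": "death", "position": particle["position"]})

def burst_event_handler(event):
    """Listening emitter: spawn a burst of sparks at the event location."""
    if event["type"] == "death":
        return [{"position": event["position"]} for _ in range(8)]
    return []

# A rocket particle reaches the end of its life high in the air...
rocket_emitter_update({"age": 2.0, "lifetime": 2.0, "position": (0, 0, 300)})
# ...and the burst emitter reacts to the generated event.
sparks = [s for e in events for s in burst_event_handler(e)]
```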



Render: describes how Unreal Engine should display each spawned particle. Note that this does not have to be visual. Unlike modules, the placement of the renderer in the stack is not necessarily relevant to draw order. Five types of renderers are currently supported: Component, Light, Mesh, Ribbon and Sprite.


Below is a quick compilation of various VFX:


Usually, for particles, only alphas are used; colour is defined through "particle colour" within Niagara, not externally through the texture. A common technique is to use values outside the 0-1 RGB range, which Unreal reads as emissive -- something frequently used within the VFX world. Below is the stylised fire VFX, which takes full advantage of alphas through channel packing. The full flame texture has an individual wisp of flame on each channel (R, G and B), giving access to flame variations which can be individually manipulated through dynamic parameters within Niagara.
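The channel-packing idea can be sketched in plain Python: three greyscale flame alphas live in one RGB texture and are unpacked per channel. The nested-list "texture" here is just a stand-in for pixel data, not real texture sampling.

```python
# Conceptual sketch of channel packing: one RGB texture = three alphas.

def unpack_channel(texture, channel):
    """Extract one channel (0=R, 1=G, 2=B) as its own greyscale alpha."""
    return [[pixel[channel] for pixel in row] for row in texture]

# A 2x2 stand-in texture: each channel holds a different flame wisp.
texture = [
    [(0.9, 0.1, 0.0), (0.8, 0.2, 0.1)],
    [(0.7, 0.3, 0.2), (0.6, 0.4, 0.3)],
]
wisp_r = unpack_channel(texture, 0)  # first flame variant
wisp_g = unpack_channel(texture, 1)  # second flame variant
```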


Fire is extremely sporadic; to emulate such energy, a complex noise is used -- a channel-packed tiling Perlin noise, to be exact. Its UVs are panned to reinforce the constant, shifting movement of fire.
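UV panning itself is simple arithmetic, sketched below in plain Python: offset the UVs by a speed-times-time term and wrap into the 0-1 range so the tiling noise repeats seamlessly. The speeds chosen are arbitrary illustration values.

```python
# Conceptual sketch of UV panning on a tiling texture.

def pan_uv(u, v, speed_u, speed_v, time):
    """Return panned UVs, wrapped into [0, 1) so tiling textures loop."""
    return ((u + speed_u * time) % 1.0, (v + speed_v * time) % 1.0)

# Pan upwards over time, as with rising fire (V decreases, then wraps).
uv = pan_uv(0.25, 0.5, 0.0, -0.3, time=2.5)
```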

Below, you can see an alternative flame texture I developed in Substance Designer. I decided not to use this method, as I don't think the finalised look would match our style accurately.

Within Niagara, this fire has a plethora of "User Parameters": parameters which can be easily accessed in-engine and swiftly changed by artists through simple sliders -- for example, colour gradient, spawn rate, spread, exposure, max/min flame size and lifetime. We have a simple fire alpha, but Niagara allows me to shape its behaviour to match. Below you can see these modules and quickly gather, through their names alone, what purpose each one serves. Each module is further broken down into custom parameters, commonly "Random Range Float" values and custom curves.

On the right are some sparks to accompany the flames. They are extremely similar to the flames, except they can collide and interact with objects. Sparks are extremely simple: all emitters within a spark system can be built from a single blurred-circle alpha. Below is a good example: the source is a flash of said alpha, and the sparks themselves are the same alpha, simply velocity-aligned and warped. Gravity is tweaked to create their "floaty" nature; for the fire, gravity is completely inverted, allowing the flames to rise.
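Velocity alignment amounts to orienting each sprite along its travel direction; a plain-Python sketch of the 2D case is below. Stretching by speed is my own illustrative addition (one common way sparks are warped into streaks), not necessarily what this project's setup does.

```python
# Conceptual sketch of velocity-aligned sprites: rotate to face along the
# velocity, and (optionally) elongate with speed so fast sparks streak.
import math

def velocity_facing(velocity_xy, stretch=0.01):
    """Return (angle_radians, length_scale) for a 2D velocity-aligned sprite."""
    vx, vy = velocity_xy
    speed = math.hypot(vx, vy)
    angle = math.atan2(vy, vx)           # orient along travel direction
    return angle, 1.0 + speed * stretch  # elongate with speed

# A spark moving diagonally up-right faces at 45 degrees.
angle, scale = velocity_facing((100.0, 100.0))
```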


Below is the stylised smoke material, extremely similar in concept to the fire material. It uses Unreal Engine's supplied smoke texture, which is realistic, and "stylises" it through distortion.

At a glance, you'll notice the smoke is a flipbook texture: one UV space made up of Sub-UVs. By referencing a "SubImageIndex" within Niagara, this animation can be played in its entirety over each particle's life span. The Niagara modules are practically identical to the fire's, except the smoke's spawn velocity is bound to a cone.
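The sub-UV lookup behind flipbook playback can be sketched in plain Python: normalised particle age maps to a frame index on the sprite-sheet grid, and that index maps to the frame's UV rectangle. The 4x4 grid is an illustrative assumption, not necessarily this smoke sheet's layout.

```python
# Conceptual sketch of flipbook (sub-UV) playback driven by particle age.

def subuv_rect(age, lifetime, columns, rows):
    """Return (u0, v0, u1, v1) of the flipbook frame at this point in life."""
    frames = columns * rows
    index = min(int(age / lifetime * frames), frames - 1)  # clamp last frame
    col, row = index % columns, index // columns
    w, h = 1.0 / columns, 1.0 / rows
    return (col * w, row * h, col * w + w, row * h + h)

# Halfway through life on a 4x4 sheet -> frame 8: start of the third row.
rect = subuv_rect(age=0.5, lifetime=1.0, columns=4, rows=4)
```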


The "dizzy" and "frozen" effects are extremely simplistic, and heavily rely on their implementation through Niagara. Both materials are simple alphas with particle colour, to be controlled later through Niagara. Snowflakes spawn within a sphere and have slight curl and drag, as if they are slowly drifting through cold air. The stars spawn with a set probability and rotate around a set point, defined in local space.

The shield effect is a tad more complex, as it requires mesh effects. I modelled a simple shield and rotated multiple copies around the character, using local space so they track the character's movement. Fresnel combined with opacity adds some interesting visual effect -- and that's the shield in its entirety.
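The fresnel term behind that rim glow is a small formula, sketched here in plain Python: the effect is brightest where the surface normal is perpendicular to the view direction and fades to nothing where the surface faces the camera. The exponent of 3 is an arbitrary illustration value.

```python
# Conceptual sketch of a fresnel rim term: pow(1 - saturate(N.V), exponent).

def fresnel(normal_dot_view, exponent=3.0):
    """Return rim intensity from the dot of surface normal and view vector."""
    ndv = max(0.0, min(1.0, normal_dot_view))  # saturate to [0, 1]
    return (1.0 - ndv) ** exponent

edge = fresnel(0.0)    # grazing angle: full rim glow
centre = fresnel(1.0)  # facing the camera: no glow
```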

The ball trail was implemented because projectiles, both your own and the opposition's, were hard to track. I tried using the fire VFX as a trail, but it was extremely noisy. A simple trail created through "Ribbon" rendering fit the role perfectly. This trail uses practically base settings, with a few tweaks, such as its width being manipulated along its length.

Below is an example of it in action:


Players needed a way to identify hit feedback. We already had a crosshair and barrier movement with a progress bar; in addition, I created a VFX which activates once a projectile collides with the player. See it below:

The ring and "star" are packed within the same texture. Two dynamic parameters create the eroding appearance, achieved by manipulating a power exponent over time and subtracting a noise over time. The final result required a fair bit of fine-tuning through graphs, and I'm happy with it.
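Those two erosion controls can be sketched in plain Python: raising the alpha to a growing power exponent sharpens and shrinks it, and subtracting an increasing slice of noise eats it away. The specific constants are illustrative assumptions, not the project's tuned values.

```python
# Conceptual sketch of alpha erosion over normalised time t in [0, 1]:
# a power exponent that grows over time, plus noise subtracted over time.

def eroded_alpha(alpha, noise, t, max_power=4.0):
    """Return the eroded alpha for one pixel at time t."""
    powered = alpha ** (1.0 + t * max_power)  # power exponent over time
    return max(0.0, powered - noise * t)      # noise subtraction over time

start = eroded_alpha(alpha=0.8, noise=0.5, t=0.0)  # effect fully visible
end = eroded_alpha(alpha=0.8, noise=0.5, t=1.0)    # mostly eroded away
```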


I would have liked to work on more complicated hybrid effects; however, our game's simplicity called for simple effects. Effects similar to those of League of Legends would be extremely overzealous and obnoxious in a setting such as our game's. At the very least, my effects effectively convey to the player the purpose they serve.



©2022 by James Holt. Proudly created with Wix.com
