Course overview
Learn dynamic effects for games
In this course, students learn the fundamental workflows and approaches for creating a wide range of dynamic effects for real-time game environments. They will learn essential methods for keeping visual effects efficient and optimized for use in a video game. The course features different types of effects for different markets: first-person shooters, 3D action games, MOBAs, mobile games, etc.
FX for Games: What You'll Learn
The more you know, the better.
Igniting your imagination
Fabio Silva is an award-winning Senior VFX Artist living in Los Angeles. He won the VES Award for Outstanding Real-Time Visuals in a Video Game and was nominated in the same category one other time. He has worked at Naughty Dog on The Last of Us Part II, Uncharted: The Lost Legacy, and Uncharted 4: A Thief's End for PlayStation 4. He has also worked on titles such as Ryse: Son of Rome for Xbox One and PC, Killzone 3, and Killzone Shadow Fall for the PS4. He now works at Blizzard Entertainment as a Sr. 3D VFX Artist.
FX for Games: Student Gallery
Winter Term Registration
Oct 21, 2019 - Feb 3, 2020
Companies that hire our students
Benefits
What makes this learning experience unique?
Receive personal, individual feedback on all submitted assignments from one of the industry's best artists.
1+ Year Access
Enjoy over 365 days of full course access. This includes all lectures, feedback, and Live Q&A recordings.
Certificate of Completion
Earn a Certificate of Completion when you complete and turn in 80% of course assignments.
Learn anywhere, anytime, and at your own pace with our online courses.
Speak to an advisor
Need guidance or course recommendations? Let us help!
Show us your skills
Not sure if you have the skills, or want to prove that you do? Show us.
Portal VFX Production
Interview with Antonio Cappiello
Hi, I’m Antonio Cappiello from Milan, Italy. I currently work at Forge Reply S.R.L. as a VFX Artist, working mostly with Houdini, Substance Designer and Unreal Engine 4 – most recently on the VR game, Theseus.
I’ve been passionate about video games since my first Commodore 64, so it’s because of this passion that I decided to dig deeper and start studying the world of 3D, especially the area of visual effects.
This specific effect is composed of three meshes: a hexagon sphere, a basic sphere, and a ring circle, each with a specific UV set. I then generated the procedural textures in SD, thinking about the best and fastest way to achieve the final look and behavior that I had in mind. As you mentioned in the question, once I’ve got the right UV set for the meshes and a few noisy textures, it’s all about combining these elements together to make it look organic.
I started with a basic polygon sphere and then used the “compute dual” parameter in the Divide SOP to generate the hexagons on the sphere's surface. This is one of the best features of working procedurally – sometimes you can get a complex shape with just two nodes, which is incredibly powerful in terms of saving time and reusing the network structure. After that, I used the Clip SOP node to turn the sphere into a semi-sphere; I simply didn’t need the other half, so this way I optimize the mesh as well. Next, I built a simple For Each SOP where I extruded each hexagon to turn it into a 3D entity, and I used the Peak SOP node to move the edges of the hexagons along their normals so I could control the distance between each hexagon. The last step was assigning the UVs to the hexagon semi-sphere. It may look messy, but in order to build the final look I wanted, this UV set was exactly what I needed.
For the energy barrier, I simply used a sphere which I clipped into a semi-sphere with the same method that I used before. Actually, it’s exactly the same sphere that I used without the Divide SOP node (without the hexagons on its surface), so as I said, it’s really efficient to be able to “go back in time” through the network and reuse the same structure without starting from scratch every time. That way most of the work is done in UE4 with texture panning, UV distortion, and masks.
I did the ring circle in Houdini as well. The most important thing to consider here is the UV set because it’s what determines the direction of movement from the center to the border of the mesh. This way when I panned the texture on it, I could simply use the UV coordinates to give it the illusion that “something” was moving away from the center. Next, I added the vertex color on the edges, in order to have a nice fading of the texture.
The last element is the rotating “locker” that you can see in the middle of the whole portal FX. I did the textures in SD and packed them into the RGB channels – this way I just needed one file instead of three and could access each component through the UE4 material. The behavior was very easy: I rotated the UVs of the texture by a sine function with a period that changes randomly, which makes it more interesting to watch.
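The sine-driven UV rotation he describes can be sketched as plain 2D math. This is a hedged illustration of the idea, not the actual UE4 material graph; the function name and the period value are my own.

```python
import math

def rotate_uv(uv, time, period):
    """Rotate a UV coordinate around the texture center (0.5, 0.5)
    by an angle driven by a sine of the given period."""
    angle = math.sin(2.0 * math.pi * time / period)  # radians
    u, v = uv[0] - 0.5, uv[1] - 0.5                  # recenter on origin
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    return (u * cos_a - v * sin_a + 0.5,             # standard 2D rotation,
            u * sin_a + v * cos_a + 0.5)             # shifted back to UV space
```

In the material you would feed `time` from a Time node (or a Dynamic Parameter) so the rotation wobbles back and forth rather than spinning continuously.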
Auto UV SOP node
For the hexagonal mesh, which has that kind of “glass” shader, the UVs were really important for the look. To create them, I used the “shortest path” method of the Auto UV SOP node. The UVs seem very messy, but they were exactly what I needed here. It’s basically an unlit translucent material, and because it doesn’t have reflections, the trick was to pan the same texture on top of itself at different speeds on the two UV coordinates, then use the Hue Shift node and a color node to get a variety of tints. The texture has a caustic pattern to emphasize the highlights while it pans along the mesh.
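The double-panning trick above boils down to sampling one texture twice with different scroll speeds and combining the results. Here is a minimal sketch of that logic; the speed values and the `sample` callback are placeholders, not values from his material.

```python
def pan(coord, speed, time):
    """Scroll a texture coordinate over time, wrapping into [0, 1)."""
    return (coord + speed * time) % 1.0

def layered_caustics(u, v, time, sample):
    """Sample the same caustic texture twice with different panning
    speeds and multiply the layers, so moving highlights reinforce
    each other where they overlap. `sample(u, v)` stands in for a
    texture lookup; the speeds here are made-up example values."""
    a = sample(pan(u, 0.13, time), pan(v, 0.07, time))
    b = sample(pan(u, -0.05, time), pan(v, 0.11, time))
    return a * b
```

Because the two layers drift at different rates, the product never visibly repeats, which is what sells the organic “glass” shimmer.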
FX as a Particle System
In this case, I spawned the whole FX as a particle system. I usually expose the parameters that I want to tweak rapidly. I control the texture animations and blending by the “dynamic parameter” of the material which allows the “communication” of the data I need between the shader and Cascade. This way I can change the animation and the behavior of the textures (like the speed, distortion, intensity etc.) by drawing some animation curves in the particle system.
For the glass element, I built a translucent unlit material which contains a few noisy textures that pan at different speeds, plus some UV distortion to make the movement more “dynamic”. I used the “Sphere Mask” node in the UE4 material shader to generate a nice “fluid” masking look when the textures pan on top of each other, to achieve that kind of “energy” field running on the surface. I also used the “emissive” input to control the texture color, of course, but also to generate a kind of rim light along the sphere’s edge using the Fresnel function. The combination of this material with the hexagon mesh (or grid mesh) should give the illusion that it’s all contained in a glassy energy shield.
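The Fresnel-driven rim light can be sketched with a couple of lines of math. This is a simplified stand-in for UE4's Fresnel node (which has more options); the exponent and intensity values are assumptions for illustration.

```python
def fresnel(normal_dot_view, exponent=5.0):
    """Approximate a Fresnel term: strongest where the surface is
    seen edge-on (dot product near 0), fading to 0 when face-on."""
    ndv = max(0.0, min(1.0, normal_dot_view))
    return (1.0 - ndv) ** exponent

def rim_emissive(base_color, normal_dot_view, rim_color, intensity=2.0):
    """Add a Fresnel-weighted rim term to the emissive color.
    Colors are (r, g, b) tuples; `intensity` is an assumed value."""
    f = fresnel(normal_dot_view) * intensity
    return tuple(b + r * f for b, r in zip(base_color, rim_color))
```

Plugged into the emissive input, this brightens the silhouette of the semi-sphere, which is what reads as the edge glow of the energy shield.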
FX Development & New Heights
Sometimes I picture effects in my mind and then I draw some rough sketches on a piece of paper or in Photoshop to get a better understanding of the elements that will be involved and what I’ll need in order to achieve the results that I have in mind. I usually think about the entire animation cycle and then draw the “frames” of it to get an idea of the behavior that it should have. I’m inspired by Japanese cartoon styles for the visual effects because they usually use really interesting shapes and they give such a cool “rhythm” to the animations. It’s definitely one of my main starting points and I’m continually trying to learn new techniques in the field. I have no idea where my style will go from here but I just keep trying new things and playing with both the techniques and the atmosphere in order to find something new in my work.
Advice for Learners
The first thing I’d advise is to watch a variety of anime and start studying the look and the animation of those effects and try to reproduce them in a game engine. Thinking about the books I’d recommend starting with Elemental Magic: The Art of Special Effects Animation by Joseph Gilland. It gives good insight into how to interpret the behavior of the natural world.
I also usually find many VFX examples on Pinterest (just type “vfx game” in the search field and you will get tons of inspirational examples). And, of course, I’d recommend the Real Time VFX site, which is an awesome VFX artist community.
Antonio Cappiello, VFX artist
Interview conducted by Kirill Tokarev
Interview with Victor Montero
Víctor Montero shared his experience of taking the CGMA course FX for Games led by Fabio Silva and did a breakdown of the environmental FX he worked on.
Hey! My name is Víctor Montero. I decided to move into game development after 2 years working in the banking sector as a computer engineer. I then completed a master's in design and video game development, where I realized that I needed something more artistic in my life, even if I still loved coding. Currently, I am specializing in VFX/technical art.
I was impressed with the high-end results that can be achieved with Unreal Engine 4 – in particular, everything related to shader programming, lighting and, especially, visual effects. I tried to study with some tutorials on YouTube, listening to dev talks, reading docs… but I felt that I needed something more than a bunch of disconnected lessons. I was able to follow those tutorials, but it was really hard for me to create my own effects. At this point, I decided to enroll in the CGMA FX for Games course. Fabio M. Silva is the instructor of the course, and I couldn’t be more glad about the decision I made. He always gives really good, personalized feedback on each assignment during the course.
I was wondering how the whole process of FX creation works. As a VFX artist, you need to have a huge range of skills like modeling, unwrapping, texture creation, shaders, animation, code… My main goals were to improve my knowledge of the visual effects and understand in depth the pipeline, as well as improve my weaker skills.
The last assignment of the course was the production of all environmental effects for a level. It took 3 weeks to complete the task. First, I analyzed the level and gathered references for what I wanted to create. Then I put some placeholders in the level and adjusted the lighting of the environment. Once I verified that everything was working as intended, I started to create all the effects.
I tried to create a quiet, desolate, and sad mood. The original environment was inspired by an Italian village, with warm light. I tried to transport it to somewhere in the north of Europe. The first step was to change the environment to misty weather, a greyish sky, and drizzle.
The atmosphere is mainly created with both fog and rain effects, plus a huge change in the lighting. Rain is a really common effect in video games and movies because it is really easy to convey feelings with it.
Note: The level has been created by Matima Studio, you can check it here.
The fog is created by two particle emitters. The first creates the dense fog, and the second one gives detail and movement to the effect.
For the dense fog, I set the material domain to “Volume” and the blend mode to “Additive” in the material shader. I created a volume with a “Sphere Mask” and multiplied it by a grayscale texture to avoid a fully spherical shape, then multiplied by a Dynamic Parameter to control the opacity from the particle emitter. Finally, I fed the particle system with this material and used “Initial Location” to keep the fog near the water.
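As a rough sketch of that density chain – sphere mask, broken up by a grayscale noise value, scaled by the Dynamic Parameter – the logic looks something like this. It is a simplification; UE4's SphereMask node has its own hardness falloff, and the function names here are mine.

```python
def sphere_mask(point, center, radius, hardness=0.5):
    """Simplified SphereMask: 1 at the center, falling to 0 at the
    radius; `hardness` sets how much of the radius is full density."""
    dist = sum((p - c) ** 2 for p, c in zip(point, center)) ** 0.5
    inner = radius * hardness          # full-density core
    if dist <= inner:
        return 1.0
    if dist >= radius:
        return 0.0
    return 1.0 - (dist - inner) / (radius - inner)  # linear falloff band

def fog_density(point, center, radius, noise, opacity):
    """Density fed to the volume material: the mask is broken up by a
    grayscale noise sample and scaled by the Dynamic Parameter."""
    return sphere_mask(point, center, radius) * noise * opacity
```

Multiplying by the noise texture is what keeps the volume from reading as a perfect sphere, and the opacity parameter lets the particle emitter fade the fog in and out over each particle's life.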
The second emitter adds cloud-shaped detail and movement to the effect. Here, I used alpha erosion in order to achieve a smooth fade in and out of these clouds. To do that, I used a dynamic parameter interpolated between [-1, 0] and added this value to the opacity channel.
This emitter creates the effect of moving clouds, faking the animation of the fog with “Initial Velocity”. For the alpha erosion, use the “Dynamic” module in the particle emitter. I wanted to control the visibility of the clouds during their lifetime, so I used the Curve Editor inside Cascade to manage it.
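The alpha erosion described above is a one-liner in the shader: the [-1, 0] parameter is added to the texture's alpha and the result is clamped, so the faintest pixels disappear first and the cloud erodes from its soft edges inward. A minimal sketch, assuming the same [-1, 0] convention:

```python
def eroded_opacity(texture_alpha, erosion):
    """Alpha erosion: `erosion` is the Dynamic Parameter, animated from
    -1 (fully eroded) to 0 (fully visible) over the particle lifetime.
    Adding it to the alpha eats away the low-alpha pixels first."""
    return max(0.0, min(1.0, texture_alpha + erosion))
```

Driving `erosion` with a Cascade curve from 0 down to -1 and back gives exactly the smooth fade-in/fade-out he describes.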
As always, there are a lot of ways to approach the creation of an effect. I just decided what was best – in my case, a big particle emitter for the raindrops and several small emitters for water splashes and ripples.
The rain effect itself doesn’t have anything fancy. It’s created by spawning many water drops made from a simple dot texture with a small distortion around it. As there are so many elements, it becomes necessary to change the type data to “GPU Particles” in the particle emitter. This means that the calculation of this effect is handled by the graphics card instead of the CPU, which is mandatory for this kind of effect. To do that, just right-click on the grey area of the particle emitter and go to Type Data/“New GPU Sprites”.
For the impact, I created a splash from a 4-frame hand-drawn animated texture synchronized with the ripples – a simple circular texture which grows in size while fading out. The ripples are also created from 4 images, not animated but randomly selected, and also masked with a panning noise to avoid repetition.
In order to achieve the motion in the particle emitter, one needs to change “Interpolation Method” to “Linear blend” in the Required Module/Sub UV and set the number of rows and columns of the flipbook (2×2 in my case). Then add “SubImage Index”, select “Distribution Float Constant Curve” in the distribution, and assign at least 2 points to the curve. As this is a really short animation (0.3 seconds), I left it linear with just two points.
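The two-point linear curve effectively maps the particle's normalized lifetime straight onto the flipbook frame index. A sketch of that mapping, under the assumption of a simple 0-to-last-frame linear curve:

```python
def sub_image_index(relative_life, num_frames):
    """Map a particle's normalized lifetime [0, 1] to a flipbook frame
    index, mimicking a two-point curve with 'Linear blend': fractional
    values mean Cascade blends between two adjacent frames."""
    index = relative_life * (num_frames - 1)
    return max(0.0, min(float(num_frames - 1), index))
```

For a 2×2 flipbook (4 frames), a particle halfway through its life gets index 1.5, i.e. an even blend between frames 1 and 2, which is what makes the short splash animation look smooth.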
The level has so many places with different shapes where I had to put this impact that I decided to expose the location and frequency of this effect to parameters. This means I can adjust each effect on each place. For example, the stairs have one emitter on each step. At the same time, I also needed to cover larger areas.
To do that, in the emitter, you have to mark each module that needs to be exposed for editing in the level so that it receives a “parameter” input. This way, each module receives its values when it is set up in the level.
The movement is produced in the particle system with “Initial Velocity”. I use a uniform vector distribution with higher values on the Z-axis so the embers move upwards. The “Orbit” module creates a chaotic movement around the Y-axis and rotates in all directions. The “Drag” module makes the embers decelerate, so as they gain altitude, their speed decreases.
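The velocity-plus-drag behavior can be sketched as a simple per-frame integration step. This is an illustration of the idea, not Cascade's actual solver; the Orbit module's chaotic offset is left out for brevity.

```python
def step_ember(position, velocity, drag, dt):
    """One integration step for an ember particle: drag scales the
    velocity down each step, so the ember slows as it climbs.
    `position` and `velocity` are (x, y, z) tuples."""
    velocity = tuple(v * max(0.0, 1.0 - drag * dt) for v in velocity)
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity
```

With an upward initial velocity, repeated steps produce exactly the described arc: the ember keeps rising, but each frame a little more slowly.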
I used two emitters, one spawning at a constant rate and the second in burst mode. Both are a dot texture but the burst one is stretched by velocity.
As for the heat distortion around it, there is a node to set this up in the Unreal material editor called “Refraction”. I feed it with a circular gradient texture as the alpha of an interpolation between two values. This effect is very expensive for the graphics card, so it’s really important to disable it when it’s out of range. I achieve this with LODs: when the camera is far from the effect, I completely remove this emitter inside the particle system.
To create this effect, I tried a new approach: I used meshes with vertex animation.
This technique has been used in several games to animate a high number of assets without the need for classic skeletons. I first learned about it from a talk by one of the tech artists on the game ABZU. They used vertex animation (together with other techniques) to animate hundreds of fish that would have been impossible to move in real time with the traditional approach. They explain it way better than I do, so if you are interested, here is the video:
To recap: with this technique, I baked the position of each vertex over time into two textures, one for positions and the other for normals. These textures are read by the engine, which applies a world position offset to each vertex on each frame. There is a tool to create these textures (the Vertex Animation Tool, provided by Epic Games). You can use this script in 3ds Max. Here you can find the information on how to use it and set it up in Unreal.
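Conceptually, the bake is just a 2D table: one row per animation frame, one column per vertex, one RGB texel per position. Here is a minimal sketch of that idea; it mirrors the concept behind Epic's Vertex Animation Tool, not its actual file format or encoding.

```python
def bake_vertex_positions(frames):
    """Bake per-vertex positions over time into a 2D 'texture':
    frames[f][v] is the (x, y, z) position of vertex v at frame f.
    All frames must share the same vertex count and order."""
    num_verts = len(frames[0])
    assert all(len(f) == num_verts for f in frames), "vertex count must match"
    return [[tuple(float(c) for c in pos) for pos in frame] for frame in frames]

def sample_offset(texture, frame, vertex):
    """What the material does at runtime: look up this vertex's baked
    position for the current frame and feed it to world position offset."""
    return texture[frame][vertex]
```

At runtime the vertex shader indexes the texture by animation time (row) and vertex ID (column), which is why hundreds of instances can play the animation with no skeleton at all.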
In the gif below you can see how everything looks once the mesh and textures are integrated into the engine. First, it’s only the animated planes; then I plug in the mask and emissive. Next, I created different material instances for butterflies, moths, and birds, changing just the images and the speed parameter.
In the material, I created a dynamic parameter that assigns random values in the particle system. This prevents all instances from moving together at the same time.
Raindrops on the Windows
This effect wasn't created during the course, but it’s a good example of how the theory explained in the course can be applied to create your own effects.
There’s nothing fancy here: I just combined two animated textures with different speeds to create the effect of falling water. A cloud noise also works as a mask to show or hide some pieces of the effect. To achieve the refraction near the edges, I use a Fresnel function.
The course provided a deep understanding of overall VFX creation and taught me a useful workflow for creating my own effects. The content of the course is really useful, but the live Q&A sessions were the best part. Receiving direct feedback from an industry veteran is priceless.
The biggest difficulty for me was the creation of textures for the effects as I wasn’t familiar with tools like Photoshop. But I learned how to create my own textures and optimize them for VFX – make them tileable, pack textures, generate noise, etc.
If you’re new to VFX, you are going to spend a lot of time tweaking the effects (and I advise you to do so). It surprised me how much time I needed to achieve the look I wanted. Besides, you’ll need to learn many new tools, so don’t be afraid of playing with them to improve your workflow and achieve better results.
Learning Houdini might be a good complement to this course. It is a powerful tool that can help you enhance the quality of your effects, and it’s becoming a standard in the industry. Lastly, I would suggest checking out the www.realtimevfx.com forum. There is a growing community there sharing knowledge, discussing, and providing good feedback and advice.
Making Electricity, Waterfall, and Dust VFX in UE4
Interview with Kidman Lee
Hi, my name is Kidman Lee and I am from Hong Kong. I started doing 3D art 3 years ago at the University of Hertfordshire. In the second year, we had a technical art class, and I made my first FX there. It was a very simple magic circle effect made by following Dean Ashford’s tutorial.
I was so satisfied with the result that I wanted to do more. I started searching for different tutorials and tools on the marketplace to learn how other artists made their effects.
In the third year, I worked with my groupmate on the final year project, Project Star Bounty. I mostly made environmental VFX like fire, water, sprites for the cinematic view, etc.
As I said, I watched many tutorials online, and this led me to the CGMA course FX for Games mentored by Fabio Silva. Most online tutorials only teach you how to make this or that effect but do not explain the reasons behind the technical decisions, so you just copy them. CGMA’s course was a different story.
To begin an effect, we need to do research first. What I found on the internet was a tutorial by Yoeri Luos Vleer on how to make a flash of lightning in the material shader.
And here is my material node for the Electricity effect:
I am using my own texture created in Photoshop. In the material shader, I used two textures: one has three different patterns packed into its RGB channels, and the other has only one pattern. Let me show it to explain everything better:
The dissolve makes it flicker:
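One common way to get this kind of flicker, which I believe matches the dissolve shown here, is to compare the mask texture against a threshold that oscillates over time, so pixels pop in and out. This is a hedged sketch of that idea; the oscillation speed is an assumed value, not one from the actual material.

```python
import math

def dissolve(pattern_value, time, speed=8.0):
    """Flickering dissolve: a pixel survives only while its mask value
    exceeds a sine-driven threshold, so the visible pattern constantly
    changes. `pattern_value` is the mask texture sample in [0, 1]."""
    threshold = 0.5 + 0.5 * math.sin(time * speed)
    return 1.0 if pattern_value > threshold else 0.0
```

Because the bright parts of the pattern stay above the threshold longer than the dim parts, the bolt's core holds while its edges strobe, which reads as electrical flicker.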
Then, I’m setting up the particle system with Beam Data. Make sure you have the Target and Source in the Emitters column.
Here is a trick on how to create collision and the sprites:
In the image above, you can see that there are two emitters: the first one is for the collision, so make it invisible and add an ‘Event Generator’. In the second emitter, you need an ‘EventReceiver Spawn’ so that when the first emitter hits any object, the second emitter responds and spawns the sparkles.
Make sure to choose the type ‘Collision’ and set up a Custom Name.
Put the spawn rate you want.
*Here, put the custom name you have set in Event Generator, otherwise it won’t work.
It is a very simple and basic blueprint which gets the location from the collision to spawn the electricity – that is why we must use ‘On Particle Collide’ (Events).
Here, I want to talk about the waterfall type of effect because I have spent quite a lot of time doing it.
Back in September 2018, I worked on a waterfall for the first time and had no idea where to begin. The most basic way I could think of was using a texture in a translucent material and then setting up a particle system to spawn it.
After doing a lot of research, I now know there are different ways to do a waterfall. You need to find the one you're most comfortable with in order to make a good-looking effect.
My waterfall consists of four things:
1. Base waterfall mesh
2. Top Splash
4. Bottom Splash
This is my waterfall material, which isn’t hard to make. A good texture affects the result greatly. I made a texture for the side edge of the waterfall, but be careful and watch the sharpness of the edge when using the Power and Multiply nodes.
Material for the Splash:
Foot Dust Effect
During the second lesson, in particular, Fabio taught us how to create our own texture in Photoshop. This is the texture I made for the foot dust. There are four shapes of dust to create variation.
Foot Dust Material:
I connected the texture to Particle Color so I could control the color and alpha (like opacity) inside the particle system.
I created a particle system for each animation, for example:
Then, I attached it to the character animation and adjusted the timing to make them perfectly match.