Trailer

by

Loose Nails logo here



Summary

Role: Graphics, engine, editor and systems

Reference game: Diablo III

Game Engine: Aurora

Level Editor: Unreal & Borealis


My contributions

We restructured the renderer from the last project, making it easier to work with and better optimized. We started using a graphics command list, which essentially "queues up" work for the GPU to do.
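
As a rough illustration of the idea (not Aurora's actual API, all names here are made up), a command list can be as simple as a queue of callables that is recorded during the update and then played back in one go:

```cpp
#include <functional>
#include <vector>

class GraphicsCommandList
{
public:
    // Record a command; nothing is sent to the GPU yet.
    void Enqueue(std::function<void()> aCommand)
    {
        myCommands.push_back(std::move(aCommand));
    }

    // Play back everything that was queued up, then clear the list.
    void Execute()
    {
        for (auto& command : myCommands)
            command();
        myCommands.clear();
    }

private:
    std::vector<std::function<void()>> myCommands;
};
```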

Now that the engine also had an editor, we needed to manage which scenes were used when playing a "build" of the game. I coded a simple scene manager which keeps track of which scenes to include in a build. It had to support both our own .aurscene format and .json.

The game object and component system from Deliverance was neither easy to work with nor great for performance. Because our programmers needed to be able to work, we quickly rewrote the system from scratch.

During this project I really wanted to have particle effects in the game. With some help from Isak Morand-Holmqvist, I wrote a particle system and built an editor for it in our inspector. I also made it possible to save particle systems as presets, so our technical artists can create effects and reuse them.

While making Deliverance we managed to get by rendering the entire scene at all times, even objects that were not in view. That was simply not feasible in this project, so I took it upon myself to get frustum culling working. I also added culling for shadow-casting point lights.

Visual effects were a requirement for this project: either particle effects or meshes with custom shaders. We wanted both, so I helped add support for custom shaders.

When we finally got around to making the menu, I rewrote the screenspace images from Deliverance and then coded the entire menu as well. We also needed health bars, so I coded world space sprites with the option of being billboarded towards the camera.

The game

Spite: The Summoning is an ARPG hack-and-slash style game mainly inspired by Diablo III. It was the first real test of our engine, and it was also where our editor, Borealis, was born. One requirement was having different types of enemies. After a lot of meetings and brainstorming, we landed on a common "popcorn" enemy which swarms the player, and a heavy enemy with a large health pool whose special ability hits popcorn enemies, healing itself in turn. Finally, we needed a boss, which we mainly based on Belial, the Lord of Lies, from Diablo III. We tried not to overscope, but looking back, even with a decent amount of time on this project we got a bit too ambitious and didn't cut enough content.


Because we needed to get to work and didn't want to bog ourselves down in minute details, we once again chose a relatively simple plot. You play as a cleric who comes to an old town to defeat a demonic invasion originating in the catacombs beneath the city.


Challenges

We had a lot of time on this project, but there were a lot of things to do. In the previous project, we barely had enough framerate to make the game playable, even on a decently powerful PC. Now we needed much larger scenes, populated with decorations and with enemies that needed to run their AI loops and pathfind on a navmesh towards the player. We also needed some sort of VFX, so the need for objects using custom shaders arose, and I wanted to implement a CPU particle system for our game as well.

 

In the last project we didn't have an editor, which made testing new features difficult. We didn't have any debug drawing either, which made seeing what happened during testing harder than necessary. Since the engine had been coded from scratch in 4 weeks, a lot of it had to be restructured and rewritten, especially the game object and component system. We debated using a finished library called entt, but ended up making our own system, simply because we thought it was cooler to use and improve something of our own rather than a finished product.


A work-in-progress testing gif showing off a scripted event which changes the color of the player's abilities.

My role and contributions

My main role as an engine programmer was fixing various problems with the engine, optimizing the rendering, and restructuring it to allow for new features, among other things. Naturally, I also assisted my fellow programmers whenever they encountered a problem or a mysterious error message they couldn't quite decipher.


Game object & component system

First off, we needed to rework the game object and component system quickly. Our main focus was making the system easier to work with: plenty of ways to find game objects, saving them with unique IDs, and getting components efficiently. Later down the line we had a bit of a funny bug: when comparing component types in a templated function, we accidentally constructed an entire new component of type T, and this happened every time we called TryGetComponent. It wasn't noticeable at first because most components didn't hold enough data to leave a mark on the frame time, but when the particle emitter component made its debut, we noticed a major drop in frame rate. The fix was adding a static GetComponentType function to each component, returning its type enum, so no temporary component has to be constructed for the comparison.
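
A minimal sketch of the fix; TryGetComponent and GetComponentType come from the text above, everything else is a hypothetical stand-in for our actual system:

```cpp
#include <memory>
#include <vector>

enum class ComponentType { Transform, MeshRenderer, ParticleEmitter /* ... */ };

struct Component
{
    virtual ~Component() = default;
    virtual ComponentType GetType() const = 0;
};

struct ParticleEmitter : Component
{
    // The buggy version compared types roughly like `T{}.GetType() == ...`,
    // constructing a full T (and all of its data) on every call.
    // A static function answers the same question with no instance at all.
    static ComponentType GetComponentType() { return ComponentType::ParticleEmitter; }
    ComponentType GetType() const override { return GetComponentType(); }
};

template <typename T>
T* TryGetComponent(const std::vector<std::unique_ptr<Component>>& someComponents)
{
    for (const auto& component : someComponents)
    {
        // No temporary T is created for this comparison.
        if (component->GetType() == T::GetComponentType())
            return static_cast<T*>(component.get());
    }
    return nullptr;
}
```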


New renderer

Restructuring the rendering then took priority. During the making of Deliverance, we just barely made rendering work well enough to function, and there were a lot of make-it-work solutions that were beyond revolting to look at. We separated rendering out of the Scene object into a new SceneRenderer object, which was easier to work with.

We struggled a bit with a silly bug initially: the clear color was transparent, so it showed the background of the editor. In the first image in the work-in-progress gallery you can see a very slightly tinted cube mesh that we didn't notice until running the graphics debugger. After that we got things working again, testing with Deliverance props. We then made it possible to press F6 to cycle through different debug views, such as albedo, normals, material, emission, vertex normals, wireframe and object ID.
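
The cycling itself is simple; a small sketch with assumed names (the debug views are the ones listed above):

```cpp
enum class DebugView
{
    None, Albedo, Normals, Material, Emission, VertexNormals, Wireframe, ObjectID,
    Count
};

DebugView myDebugView = DebugView::None;

void OnF6Pressed()
{
    // Step to the next view, wrapping back to None after ObjectID.
    myDebugView = static_cast<DebugView>(
        (static_cast<int>(myDebugView) + 1) % static_cast<int>(DebugView::Count));
}
```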


Scene management

Further on, we needed to rework how scenes were handled. In the previous project, the scene wasn't so much a scene object as it was the game world in its entirety. We needed to be able to open different scenes because we wanted our editor to have a "play mode", so we could "play" a scene like in Unity and then return to it as it was before play was pressed. Scenes also needed to be handled like assets. Eventually, we wanted to be able to add game objects at runtime, find game objects in the scene by name or ID, and transition between scenes during runtime, all of which required a Scene Manager. And because this project relies on scenes exported from Unreal, we wanted the ability to choose environment settings for each scene in the final build.

 

The solution ended up being rather unsophisticated, but it worked. Essentially, the level designers and graphical artists chose the settings they wanted for the directional light, the cubemap and the ambient light strength, as well as the height fog created by our talented technical artists. That data was then entered into a struct and saved per build scene index. Whenever a scene was to be opened, I first checked whether it existed in the Scene Manager and, if so, loaded the appropriate scene settings; otherwise the scene was loaded as usual without any special settings. Hindsight being 20/20, I realize I could simply have exposed these settings in the buildscenes file so the level designers and graphical artists could have done it all themselves.
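
A rough sketch of that lookup; the real struct layout isn't shown here, so all names and fields are assumptions based on the description above:

```cpp
#include <string>
#include <unordered_map>

struct Vector3 { float x = 0, y = 0, z = 0; };

struct SceneSettings
{
    Vector3     myDirectionalLightColor;
    Vector3     myDirectionalLightRotation;
    std::string myCubemapPath;
    float       myAmbientStrength = 1.0f;
    float       myHeightFogDensity = 0.0f;  // authored by the technical artists
};

class SceneManager
{
public:
    bool TryGetSettings(int aBuildIndex, SceneSettings& outSettings) const
    {
        auto it = mySettingsPerBuildIndex.find(aBuildIndex);
        if (it == mySettingsPerBuildIndex.end())
            return false;  // scene loads as usual, without special settings
        outSettings = it->second;
        return true;
    }

private:
    std::unordered_map<int, SceneSettings> mySettingsPerBuildIndex;
};
```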



Particle system & editor

For this project I also made the particle system and an editor for it, adding features as I went along and as requested by the technical artists. It was a lot of fun both to make the editor and to create effects using my own system. Using a gradient class made by Isak Morand-Holmqvist, I added support in the editor for color and alpha keys, letting a particle lerp between different colors and alpha values.
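
A minimal sketch of how evaluating such a gradient can work; the actual gradient class is Isak's, and these names are made up. Alpha keys follow the same pattern with a single channel:

```cpp
#include <vector>

struct ColorKey { float time; float r, g, b; };

// Lerp between the two keys surrounding normalized lifetime t in [0, 1].
// Assumes the keys are sorted by time and the vector is non-empty.
ColorKey Evaluate(const std::vector<ColorKey>& someKeys, float t)
{
    if (t <= someKeys.front().time) return someKeys.front();
    if (t >= someKeys.back().time)  return someKeys.back();
    for (size_t i = 1; i < someKeys.size(); ++i)
    {
        if (t <= someKeys[i].time)
        {
            const ColorKey& a = someKeys[i - 1];
            const ColorKey& b = someKeys[i];
            const float f = (t - a.time) / (b.time - a.time);
            return { t, a.r + (b.r - a.r) * f,
                        a.g + (b.g - a.g) * f,
                        a.b + (b.b - a.b) * f };
        }
    }
    return someKeys.back();
}
```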

 

I was acutely aware of the risk of this type of particle system eating up a lot of performance, so I warned our artists not to be too bold with spawn rates. However, it turned out to be fairly capable, rarely putting much of a dent in the frame time. Still, we made the system update only when in view and made it dynamically calculate its own render bounds. I also added the option for a system to always update, even when not in view, so effects don't look like they react to being seen. It worked very well: when a system with always-simulate active is out of view, a much less frequent update runs instead.
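
A sketch of that update policy, with assumed names and an assumed off-screen interval:

```cpp
struct Bounds  { /* e.g. center + radius, recomputed from live particles */ };
struct Frustum { bool Intersects(const Bounds&) const { return true; /* plane tests */ } };

class ParticleSystem
{
public:
    void Update(float aDeltaTime, const Frustum& aViewFrustum)
    {
        myTimeSinceUpdate += aDeltaTime;
        const bool inView = aViewFrustum.Intersects(myRenderBounds);

        if (!inView && !myAlwaysSimulate)
            return;  // culled: neither visible nor set to always simulate

        // Off-screen always-simulate systems tick at a reduced rate.
        if (!inView && myTimeSinceUpdate < myOffscreenInterval)
            return;

        Simulate(myTimeSinceUpdate);  // advance by all accumulated time
        RecalculateRenderBounds();    // bounds follow the live particles
        myTimeSinceUpdate = 0.0f;
    }

private:
    void Simulate(float) {}
    void RecalculateRenderBounds() {}
    Bounds myRenderBounds;
    bool   myAlwaysSimulate = false;
    float  myOffscreenInterval = 0.25f;  // assumed value
    float  myTimeSinceUpdate = 0.0f;
};
```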


Culling of all kinds

But what use is having render bounds for the particle system if they aren't used anywhere? Before all of this, I put a lot of work into the camera classes: one for the editor and one as a component for viewing the world in play mode. Both inherit from the base class CameraBase. It was a lot of fun to make because we could use both cameras independently while they shared a lot of functionality, which made the camera class easy to tweak or expand. Not to mention, thanks to polymorphism, rendering became easier to handle, as all I needed to do was designate which CameraBase to use for rendering.
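
A rough sketch of that hierarchy; CameraBase is real, the rest of the details are filled in as assumptions:

```cpp
struct Matrix4 { float m[16] = {}; };

class CameraBase
{
public:
    virtual ~CameraBase() = default;
    // Shared functionality lives here: projection, frustum generation, etc.
    Matrix4 GetProjection() const { return myProjection; }
    // Each derived camera provides its own view transform.
    virtual Matrix4 GetView() const = 0;
protected:
    Matrix4 myProjection;
};

class EditorCamera : public CameraBase  // free-flying editor camera
{
public:
    Matrix4 GetView() const override { return myView; }
private:
    Matrix4 myView;
};

class CameraComponent : public CameraBase  // component used in play mode
{
public:
    Matrix4 GetView() const override { return myView; }
private:
    Matrix4 myView;
};

// Rendering only needs the interface, so either camera can drive it.
void Render(const CameraBase& aCamera)
{
    const Matrix4 view = aCamera.GetView();
    const Matrix4 proj = aCamera.GetProjection();
    // ... cull against the camera frustum and draw ...
}
```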

My main reason for making the CameraBase class, though, was to get frustum culling going and then further optimize the culling itself. Frustum culling, in essence, is a method for detecting whether or not a mesh is in view of the camera. Rendering things takes time, for both the CPU and the GPU; if we only look at 20% of the scene, why render the full 100%? Going on without frustum culling was simply not feasible; it was essential for performance.

 

It took me a while, but eventually I managed to implement frustum culling. I ran into some trouble with meshes not having proper bounding boxes, so I converted them all into bounding spheres, which are cheaper to cull. It works like this: I generate the camera's frustum, split it into six planes, and then check whether the bounding sphere of each mesh to be rendered is inside the frustum. For this project, we figured most of the map would be cut off by the left and right frustum planes, then the top and bottom planes, and finally near and far. If a mesh is fully outside any one of these planes, we can skip all further calculations, because we know the mesh is not in view. By checking the most likely planes first, we can often rule out having to check the rest, saving a couple of plane-vs-sphere collision checks per mesh.
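
The core test looks roughly like this; a minimal sketch assuming plane normals point into the frustum, with hypothetical names:

```cpp
struct Vector3 { float x, y, z; };
struct Plane   { Vector3 normal; float d; };  // dot(normal, p) + d >= 0 means inside
struct Sphere  { Vector3 center; float radius; };

float Dot(const Vector3& a, const Vector3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Planes ordered by how likely they are to reject a mesh:
// left, right, top, bottom, near, far.
bool IsInFrustum(const Sphere& aSphere, const Plane (&aPlanes)[6])
{
    for (const Plane& plane : aPlanes)
    {
        // Fully outside one plane: no need to test the rest.
        if (Dot(plane.normal, aSphere.center) + plane.d < -aSphere.radius)
            return false;
    }
    return true;
}
```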


Menu, HUD, images and sprites

Finally, as the project was nearing completion, I coded not only screenspace images but also world space sprites with an option for billboarding. I went on to code the entire main menu, the pause menu, the level select, the options menu and the credits screen. It wasn't the most fun I have ever had coding, but looking at the result I'm still fairly pleased.
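
One common way to billboard a sprite (assumed here, not necessarily what Aurora does) is to build its quad from the camera's right and up vectors, which keeps the sprite parallel to the screen, exactly what a health bar wants:

```cpp
struct Vector3
{
    float x, y, z;
    Vector3 operator+(const Vector3& o) const { return { x + o.x, y + o.y, z + o.z }; }
    Vector3 operator-(const Vector3& o) const { return { x - o.x, y - o.y, z - o.z }; }
    Vector3 operator*(float s) const { return { x * s, y * s, z * s }; }
};

// Build the four corners of a camera-facing quad from the camera's axes.
// Names and vertex layout are assumptions.
void BuildBillboardQuad(const Vector3& aCenter, float aWidth, float aHeight,
                        const Vector3& aCamRight, const Vector3& aCamUp,
                        Vector3 outCorners[4])
{
    const Vector3 right = aCamRight * (aWidth * 0.5f);
    const Vector3 up    = aCamUp * (aHeight * 0.5f);
    outCorners[0] = aCenter - right + up;  // top-left
    outCorners[1] = aCenter + right + up;  // top-right
    outCorners[2] = aCenter - right - up;  // bottom-left
    outCorners[3] = aCenter + right - up;  // bottom-right
}
```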


Somewhere in this mess I also helped optimize the navmesh pathfinding originally coded by Dino Zvonar. He had rushed it out at first, trying to get both the pathfinding and the path post-processing to work; the latter uses "funneling", which makes the path look less robotic and dumb, to put it bluntly. In the course of all this make-it-work, he didn't find the time to make it fast. So I spent some time fixing some of the errors made in the rush, and then helped him solve a problem with height differences in the navmesh, such as when walking up a stair or slope. We also had an idea for a pathfinder manager, which would take in pathfinding requests and use a thread pool to calculate the paths whenever time could be found.
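
The pathfinder manager never got past the idea stage, but a minimal version could look something like this, with std::async standing in for a proper thread pool and every name being an assumption:

```cpp
#include <future>
#include <vector>

struct Vector3 { float x, y, z; };
using Path = std::vector<Vector3>;

// Placeholder for the existing navmesh search (A* plus funnel post-processing).
Path FindPath(Vector3 aFrom, Vector3 aTo) { return { aFrom, aTo }; }

class PathfinderManager
{
public:
    // Callers hand in a request and get a future they can poll later,
    // so AI updates don't stall on long searches.
    std::future<Path> RequestPath(Vector3 aFrom, Vector3 aTo)
    {
        return std::async(std::launch::async, FindPath, aFrom, aTo);
    }
};
```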


It's far from perfect, but knowing it was optimized enough not to wreck the frame time, we decided to put our focus elsewhere.