Atlantean Ascent

Trailer coming soon

by

Loose Nails logo here



Summary

Role: Graphics, physics, editor, visual scripting

Reference game: Remnant II

Game Engine: Aurora

Level Editor: Unreal & Borealis


My contributions

We had wanted to implement render pipeline state objects sooner, but there wasn't enough time. A render pipeline state object describes one particular render state, with its own render targets, shader resource views, depth stencil, blend state, rasterizer state and more. Binding a whole state in one go reduces the total number of state changes the GPU has to make each frame.

Physics was a requirement for this project. We chose PhysX and integrated it into the engine, writing equivalent components in our own engine and building a physics scene representation that was also connected to our own play and stop system.

After a while, custom shader support had to be restructured and rewritten, because the new automatic instancing system didn't take meshes using custom shaders into account. A new vertex shader was needed, the renderer needed a bit of work for everything to run smoothly, and the vertex shader had to be reset to the new default whenever a custom one had been used previously.

Made the particle system update less frequently when out of view, added world space support and made particle systems parentable to other game objects with help from Dino Zvonar. Also optimized particle updating so it takes less frame time.
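
The throttling itself is simple in principle. Here is a minimal sketch of the idea with made-up names; the real version hooks into our visibility results rather than a plain flag.

    // Hypothetical sketch: an emitter that failed the visibility test is
    // updated at a reduced rate instead of every frame.
    struct ParticleEmitter
    {
        bool  visibleLastFrame = true;
        float pendingDelta = 0.0f;
    };

    void Simulate(ParticleEmitter&, float /*deltaTime*/) { /* advance particles */ }

    constexpr float kThrottledStep = 1.0f / 10.0f; // assumed 10 Hz when off-screen

    void UpdateEmitter(ParticleEmitter& emitter, float deltaTime)
    {
        if (emitter.visibleLastFrame)
        {
            Simulate(emitter, deltaTime);            // full-rate update
            return;
        }
        emitter.pendingDelta += deltaTime;           // accumulate while out of view
        if (emitter.pendingDelta >= kThrottledStep)
        {
            Simulate(emitter, emitter.pendingDelta); // one larger catch-up step
            emitter.pendingDelta = 0.0f;
        }
    }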

Using the base we were given by TGA, Isak and I implemented visual scripting in our own engine with quite a bit of functionality, adding nodes as the need for them arose.

I also added a very simple but helpful timescale slider, as well as settings for distance fog, so that they can be tested in the editor and saved in our own scene format.
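
The slider itself boils down to scaling the delta time the rest of the engine consumes. A minimal sketch, assuming a Dear ImGui-style editor panel (an assumption; g_TimeScale and the panel name are stand-ins):

    #include <imgui.h>

    static float g_TimeScale = 1.0f;

    void DrawTimePanel()
    {
        ImGui::Begin("Time");
        ImGui::SliderFloat("Timescale", &g_TimeScale, 0.0f, 2.0f);
        ImGui::End();
    }

    float ScaledDeltaTime(float rawDeltaTime)
    {
        return rawDeltaTime * g_TimeScale; // everything driven by dt speeds up or slows down
    }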

The game

Atlantean Ascent is the seventh project and is currently still a work in progress. It is a third-person combat shooter based on Remnant II. Our animators wanted to make a game that could make use of motion-captured animations.


We tried to under-scope this project as much as possible, as other projects were still unfinished and needed more work. The plan from the start was to make one enemy first, see how much time we had left, and perhaps add a ranged enemy as well. A problem with this approach is that levels really need to be designed with ranged enemies in mind from the start.


Story and setting

The player takes the role of a fisherman who somehow got lost at sea and was transported to an underwater Atlantean city. The city is filled with hostile fish-like humanoid creatures, and the fisherman has to use his harpoon gun to fight these inhabitants and find a way to escape his fate.

My role and contributions

Since we still had a lot of work left on some previous projects, I decided to do whatever was necessary. My role in this project was therefore somewhat diminished by all the extra work I had to put into Deliverance and Spite: The Summoning. The engine felt mostly up to par for this project, which is why we chose to put one engine programmer on finishing the other projects.

 

It was also during this project that we decided to try to ditch Unreal Engine entirely, as it was starting to become more of a hindrance than a help. We set out to improve our own engine and editor to the point where we could completely sever our reliance on Unreal. Our level designers said they would be more than happy to work in our engine and skip the Unreal middle-man.

 

Until now we've been using Unreal Engine to build levels, exporting them as .json files and then reading from those to build equivalent levels in our own engine. This means a lot of our editor's functionality can't be used properly: as soon as we save our scenes, they're saved in a different format, and we haven't yet built a good system for combining a .aurscene with a scene exported from Unreal. We've talked about it in the programming team; the idea is essentially to load a .json additively and just add its game objects to the current scene, but even that would be an imperfect solution.
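
A sketch of that additive-load idea, using nlohmann::json and a hypothetical Scene helper purely for illustration (the "gameObjects" key name is assumed):

    #include <fstream>
    #include <string>
    #include <nlohmann/json.hpp> // assumed JSON library, for illustration only

    struct Scene
    {
        void AddGameObjectFromJson(const nlohmann::json& desc)
        {
            /* create a game object and its components from the description */
        }
    };

    // Instead of replacing the current scene, append every exported game
    // object to it, so a .aurscene and an Unreal export can coexist.
    void LoadUnrealExportAdditive(Scene& scene, const std::string& path)
    {
        std::ifstream file(path);
        const nlohmann::json exported = nlohmann::json::parse(file);
        for (const auto& gameObject : exported["gameObjects"])
            scene.AddGameObjectFromJson(gameObject);
    }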

 

We also needed to introduce level designers and technical artists to our script graph editor system.

 

With this in mind, some of my work has been quality-of-life related: adding editor functionality, adding more nodes to the visual script graph editor, and further optimizing scene rendering.


At the very start of the project, Isak Morand-Holmqvist and I began working on the visual script graph editor, building it directly into our own engine. Another requirement for this project was having some kind of physics engine, and the suggestion was to simply use PhysX. So alongside the script graph editor work, I integrated PhysX into our engine, made sure our engine had the components PhysX needs, and coded the interface for our physics layer, which runs on PhysX. Currently we support capsule, box and sphere colliders, all of which can be either static or dynamic if they're given a rigidbody. The player uses a special Character Controller component, which works a little differently from a normal collider. I didn't work much on it beyond fixing a bug where the player couldn't climb slopes; the rest of the Character Controller implementation was done by Isak Morand-Holmqvist.
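
To give an idea of what the integration involves, here is a minimal sketch of a PhysX scene being created and then stepped only while in play mode. The setup calls are standard PhysX API, but isPlaying and fixedDeltaTime are stand-ins rather than our actual engine code.

    #include <PxPhysicsAPI.h>
    using namespace physx;

    static PxDefaultAllocator     gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    PxScene* CreatePhysicsScene(PxPhysics*& outPhysics)
    {
        PxFoundation* foundation =
            PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
        outPhysics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

        PxSceneDesc sceneDesc(outPhysics->getTolerancesScale());
        sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
        sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
        sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
        return outPhysics->createScene(sceneDesc);
    }

    // Stepped from the engine's update loop, but only in play mode, so that
    // pressing stop leaves the physics scene untouched.
    void TickPhysics(PxScene* scene, bool isPlaying, float fixedDeltaTime)
    {
        if (!isPlaying)
            return;
        scene->simulate(fixedDeltaTime);
        scene->fetchResults(true); // block until the simulation step is done
    }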


As for the script graph editor, we added lots of basic nodes, and some more advanced ones, all to help our level designers create new and interesting gameplay without further assistance from the programming team. A short but incomplete list of nodes, with a sketch of one of them after the list:

  • If statement
  • For loop
  • Timer with on elapsed and while running execution
  • Translate position, rotate, look at, rotate around
  • Vector2 and vector3 lerp
  • Add component (mesh renderer and visual script)
  • Modify component - play animation, set light intensity, toggle light, activate particle emitter
  • Math nodes - Add, subtract, multiply, divide, cos, sin, tan, atan2, get random float from range, less than, remap (0-1)
  • Game object nodes - Destroy, instantiate, compare is same game object, get game object by name
  • Play audio
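
To give an idea of what one of these amounts to, here is a deliberately simplified, hypothetical version of the get-random-float-from-range node; the actual base we built on from TGA is structured differently, with proper pins and execution flow.

    #include <random>
    #include <string>
    #include <unordered_map>

    // Hypothetical node context: named input and output pins as plain floats.
    struct NodeContext
    {
        std::unordered_map<std::string, float> inputs;
        std::unordered_map<std::string, float> outputs;
    };

    // "Get random float from range": reads the Min/Max pins, writes Result.
    struct RandomFloatNode
    {
        void Execute(NodeContext& ctx)
        {
            const float min = ctx.inputs["Min"];
            const float max = ctx.inputs["Max"];
            static std::mt19937 rng{ std::random_device{}() };
            std::uniform_real_distribution<float> dist(min, max);
            ctx.outputs["Result"] = dist(rng);
        }
    };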


As I said, this list is somewhat incomplete. I had a hand in making many of these nodes, but not all of them. The main ones I didn't make myself or assist with were the math operator nodes and the activate particle emitter node, which our technical artists wanted to make in order to learn our system.


Early on, a big structural change we wanted to make was to implement something we call a "render pipeline state object": an object that contains all the information needed to render a certain stage, including which shader resources, render targets and shaders to use, along with the depth stencil state, blend state, rasterizer state, input layout, vertex stride and topology. The idea was to minimize how often the GPU has to change state, which costs performance, and to make it easier for everyone to add a new pipeline stage when they need to render something. It took quite a while to implement, but it eventually made working with the engine much easier and brought a small performance increase as well.
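
As a rough illustration, assuming a D3D11-style renderer (which matches the states listed above) and with made-up member names, the object can be little more than a struct holding every piece of state, plus one function that binds it all in one go:

    #include <d3d11.h>
    #include <vector>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // Sketch of what a render pipeline state object might bundle together.
    struct PipelineStateObject
    {
        ComPtr<ID3D11VertexShader>           vertexShader;
        ComPtr<ID3D11PixelShader>            pixelShader;
        ComPtr<ID3D11InputLayout>            inputLayout;
        ComPtr<ID3D11DepthStencilState>      depthStencilState;
        ComPtr<ID3D11BlendState>             blendState;
        ComPtr<ID3D11RasterizerState>        rasterizerState;
        std::vector<ID3D11RenderTargetView*> renderTargets;
        ID3D11DepthStencilView*              depthStencil = nullptr;
        D3D11_PRIMITIVE_TOPOLOGY             topology = D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST;
        UINT                                 vertexStride = 0;
    };

    // One call per stage instead of scattered state changes all over the renderer.
    void Bind(ID3D11DeviceContext* ctx, const PipelineStateObject& pso)
    {
        ctx->VSSetShader(pso.vertexShader.Get(), nullptr, 0);
        ctx->PSSetShader(pso.pixelShader.Get(), nullptr, 0);
        ctx->IASetInputLayout(pso.inputLayout.Get());
        ctx->IASetPrimitiveTopology(pso.topology);
        ctx->OMSetDepthStencilState(pso.depthStencilState.Get(), 0);
        ctx->OMSetBlendState(pso.blendState.Get(), nullptr, 0xffffffff);
        ctx->RSSetState(pso.rasterizerState.Get());
        ctx->OMSetRenderTargets(static_cast<UINT>(pso.renderTargets.size()),
                                pso.renderTargets.data(), pso.depthStencil);
    }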


Long live the render pipeline state object!


This in-game footage is from a build where the character model had just been updated but the material hadn't been updated yet, resulting in a funky look for our player character.


After some optimization work by Isak Morand-Holmqvist in the SceneRenderer, problems appeared for objects using custom pixel and vertex shaders. What he had done was add automatic instancing of meshes, meaning every mesh was treated as the start of an instance, even if the instance contained only one mesh. Since I usually had a close connection with the technical artist duo, I took it upon myself to fix the problem. It came down to a small oversight but turned out to be quite a bit of work. All of our standard deferred-rendered meshes were now using the DefaultInstanced vertex shader, except for skinned meshes, which still needed to have their bones sent to the GPU. The new system saves bone data directly in the vertex and only uploads the full bone buffer of 128 4x4 matrices when rendering a skinned mesh, rather than all the time.


Previously we used an ObjectBuffer for the transform, the inverse-transposed transform and a few other things. The new system left the forward-rendered part of the renderer unchanged, which caused odd behavior because the object buffer was no longer set properly. The issue was resolved by creating a new DefaultStatic vertex shader that forward-rendered meshes use by default. This did come with a slight drawback, namely that animated meshes can't use custom shaders, but that was deemed an acceptable compromise.
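
The selection logic that came out of this is roughly the following, written here as a sketch with made-up names rather than the actual renderer code:

    // Custom shaders take priority, skinned meshes keep the bone-buffer path,
    // forward-rendered meshes fall back to DefaultStatic (which uses the
    // ObjectBuffer), and everything else uses DefaultInstanced.
    struct VertexShader { const char* name; };

    enum class MeshKind { DeferredStatic, ForwardStatic, Skinned };

    struct Mesh
    {
        MeshKind kind = MeshKind::DeferredStatic;
        VertexShader* customVertexShader = nullptr;
    };

    struct ShaderSet
    {
        VertexShader defaultInstanced{ "DefaultInstanced" };
        VertexShader defaultStatic{ "DefaultStatic" };
        VertexShader skinned{ "Skinned" };
    };

    VertexShader* PickVertexShader(const Mesh& mesh, ShaderSet& shaders)
    {
        if (mesh.customVertexShader)              // custom shader wins, instancing is skipped
            return mesh.customVertexShader;
        if (mesh.kind == MeshKind::Skinned)       // still needs bones on the GPU
            return &shaders.skinned;
        if (mesh.kind == MeshKind::ForwardStatic) // forward path relies on the ObjectBuffer
            return &shaders.defaultStatic;
        return &shaders.defaultInstanced;         // instanced deferred default
    }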



The UI graphics are also not finished yet, so we're using a placeholder texture.