Final Project Write-Up

This is the Final Project for our Game Engineering Class.
In this assignment, we are required to make a game with the game engine we built this semester.
The major purpose of this assignment is to show off the asset builders’ abilities; as you can see in the final product, there are multiple different models, materials, textures, etc.
For my final project, I decided to make Joust.

Here is a gameplay video:

Download builds (Joust)
Direct3D
OpenGL
Control:
Escape to close
Left/Right Arrow key to control movement direction
Tap Spacebar to fly

Rules:
Hit enemies from above and collect their eggs once they are dropped.
If an egg is left unattended for 5 seconds, a new enemy will spawn from it.
Falling into the lava also causes a game over.

Before making the game, I also implemented diffuse and ambient lighting in my engine.
And during the process, I realized that if a uniform is declared but not used in the body of a shader, the builder will remove the uniform from the shader on compile, which causes the code that retrieves the uniform handle to fail.
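For example, on the OpenGL side the failure shows up when retrieving the handle (a minimal sketch with a hypothetical uniform name):

// "g_ambientLight" is a hypothetical name; GLSL compilers may strip a uniform
// from the compiled program if the shader body never actually reads it.
const GLint location = glGetUniformLocation( programId, "g_ambientLight" );
if ( location == -1 )
{
	// The uniform is either misspelled or was optimized away by the compiler,
	// so any later glUniform*() call made with this handle would silently do nothing.
}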

Most of the engine components I used in this game came from previous assignments in this class.

Since the asset builder system we built was pretty robust, I was able to create my own models and textures and import them into the game without much effort.

Textures I found online and made in Photoshop

Knight model I made in Maya

I was also able to create a new type of animated material (as you can see in the game, the flowing lava is achieved by animating the UV values of the mesh) by adding a new vertex shader.
In doing so I realized that any time-sensitive shader probably needs to know the total time in seconds since the game started, so I added a built-in handle for that.
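A minimal sketch of the C++ side (the names here are illustrative, not my engine’s actual API):

// Accumulated once per frame from the frame's delta time
s_totalSecondsElapsed += secondsElapsedThisFrame;

// When binding a time-sensitive effect, write the total time to its built-in handle
// (glUniform1f on OpenGL; SetVertexShaderConstantF on Direct3D 9)
glUniform1f( s_elapsedTimeUniformHandle, s_totalSecondsElapsed );

// The lava's vertex shader then offsets the UVs by some function of this value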

But that was mainly focused on the asset builder and graphics side.
In order to make this game, I also needed to pull in some material from my engineering class last semester:

Smart Pointer:
I created a “world” class that contains all the game objects, including basic info like world transforms, renderables, colliders, etc., which are also referenced in their own systems. Therefore I used smart pointers for them, since that way I don’t need to worry about forgetting to delete a reference in one system, and it also avoids dangling-pointer problems.
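A minimal sketch of the idea (the class and member names are illustrative):

#include <memory>
#include <vector>

class GameObject; // holds the world transform, renderable, collider, etc.

class World
{
public:
	// The world holds the canonical list of game objects;
	// the render and collision systems keep shared_ptrs to the same objects.
	std::vector<std::shared_ptr<GameObject>> m_gameObjects;
};

// When the last shared_ptr goes away the object is deleted exactly once,
// so no system can forget to delete it or be left with a dangling pointer.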

AABB Collision detection:
I created the bounding box from the mesh file I made for this assignment. Under the assumption that the mesh will always be axis-aligned, I simply loop through all the vertices in the vertex buffer to get the minimum and maximum x, y, z, and then create the bounding box from those two corner points (the extents are the maximums minus the minimums).
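In code, the scan looks roughly like this (a sketch; sVertex stands in for my actual vertex format):

#include <algorithm>

struct sVertex { float x, y, z; /* color, UVs, etc. */ };

// Start from the first vertex and widen the box with every other one
float minX = vertices[0].x, minY = vertices[0].y, minZ = vertices[0].z;
float maxX = vertices[0].x, maxY = vertices[0].y, maxZ = vertices[0].z;
for ( size_t i = 1; i < vertexCount; ++i )
{
	minX = std::min( minX, vertices[i].x );  maxX = std::max( maxX, vertices[i].x );
	minY = std::min( minY, vertices[i].y );  maxY = std::max( maxY, vertices[i].y );
	minZ = std::min( minZ, vertices[i].z );  maxZ = std::max( maxZ, vertices[i].z );
}
// (minX, minY, minZ) and (maxX, maxY, maxZ) are the two corners of the AABB;
// the box's extents are the maximums minus the minimums.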

But the world and game object parts are relatively less well-structured, so if I had more time, I would restructure them to make implementing more complicated game logic easier.
I also wanted to create a game object/transform hierarchy like most other game engines have, so I could animate the body parts of my knights.

The major thing I learnt from this class was how to set up a project in Visual Studio properly. It can be a painful process in the beginning, where you have to set up multiple projects, set dependencies, figure out the library linkages, etc. But it all proves worthwhile later, when everything comes together nicely and you are actually using it.

I also learnt how an asset pipeline works in a game engine, at least in the scope of the engine we were building, which I didn’t have a clear idea of before. Although I am aiming to be more of a gameplay programmer than an engine programmer, I think this knowledge is still important, because having some insight into how a tool works definitely helps improve efficiency and reduces the chances of making silly mistakes when using it.

As we all know, every engineer has their own coding habits, and it’s impossible to unify them. In my mind a good coding habit is to structure the code nicely, with clear comments, naming, and declarations, and to minimize repeated code. The ideal case is that someone without much knowledge of your coding style should be able to understand the gist of your code in less than a day and be able to follow your structure and build on it.

In real life that’s really hard to achieve due to deadlines and human elements (the laziness in most engineers’ blood, mine included). But it is still my goal: to write code that is clear and easy to work with.

EAE6320 – ASSIGNMENT 13 – WRITE-UP

Download builds
Direct3D
OpenGL
Control:
Escape to close
Cat Movement: Arrow Keys + Q or E to rotate
Camera Movement: WASD -> horizontal movements, LCtrl and Spacebar -> vertical movements

For this assignment, we are adding texture features to the material components.
Assignment13-3
Screenshot of the human-readable material file.
Assignment13-1
This is a screenshot of the updated binary version of a material file.
The highlighted part is the new texture data.
The first byte indicates how many textures there are in this material, which is 1 in this case.
It is followed by an array of texture data, each entry consisting of the texture path and the sampler uniform name, both terminated with a null character.

Assignment13-4
To read in the texture data, I first check whether the cursor is at the end of the file after I have read all the other information from the material file (since textures are optional, it is possible that no texture data exists in the file).
If it is not at the end, I read the first remaining byte as a uint8_t, which is the texture count, and then loop through the rest of the file to get the texture data.
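A sketch of that reading logic (the buffer handling is simplified, and the function name is illustrative):

#include <cstdint>
#include <cstring>

void LoadTextureData( const char* cursor, const char* const fileEnd )
{
	// Texture data is optional, so the cursor may already be at the end of the file
	if ( cursor < fileEnd )
	{
		// The first remaining byte is the texture count
		const uint8_t textureCount = *reinterpret_cast<const uint8_t*>( cursor );
		cursor += 1;
		for ( uint8_t i = 0; i < textureCount; ++i )
		{
			const char* const texturePath = cursor;
			cursor += std::strlen( texturePath ) + 1;	// skip past the null terminator
			const char* const samplerUniformName = cursor;
			cursor += std::strlen( samplerUniformName ) + 1;
			// ... create the texture and look up the sampler uniform handle here
		}
	}
}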

I made another class named GraphicTexture for storing the texture data.
Since there might be multiple textures in a material, it is more manageable to put all the information in a class and manage each texture as a single entity.

Finally, a screenshot of the final product (Left: Direct3D, Right: OpenGL)
Assignment13-2

EAE6320 – ASSIGNMENT 12 – WRITE-UP

Download builds
Direct3D
OpenGL
Control:
Escape to close
Cat Movement: Arrow Keys + Q or E to rotate
Camera Movement: WASD -> horizontal movements, LCtrl and Spacebar -> vertical movements

For this assignment, we need to introduce a material system into our engine.
In previous assignments, the colors on the models depended solely on the vertex colors stored in the vertex data of the mesh files.
So if the user wanted to create the same mesh with a different color, they needed to create another mesh file with the corresponding vertex colors.
With materials, we can reuse the same meshes and apply different colors via different materials.
We can also share the same material across different meshes for a unified look.

Untitled
This is a screenshot of my human-readable material file.
effectPath is the relative path of the effect file.
uniformData holds different uniform information.
name – uniform name
shaderType – type of shader, vertexShader/fragmentShader
values – values of uniform
textureData is not used for this assignment, it’s a placeholder for future extension into textures.

Assignment12-1
Assignment12-1
The above 2 screenshots are the release and debug versions of the binary material file.
The file starts with the effect path, terminated by a null character.
The following byte holds the number of uniforms (uint8_t).
It is followed by the corresponding number of uniform data entries.
Each uniform data entry is structured as follows:
first 8 bytes (4 bytes for OpenGL) – uniformHandle – stored as 0, a placeholder to make reading easier, so the loader can assign into the structure directly.
next 1 byte – shaderType – an enum identifying the shader type.
next 3 bytes – memory alignment padding (uninitialized garbage bytes in this case)
next 16 bytes – a float array of length 4
next 1 byte – the number of values to set, which decides how many floats in the array are valid
next 3 bytes – memory alignment padding (uninitialized garbage bytes in this case)
And the rest of the file holds the names of the uniforms, separated by null terminators.
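The matching in-memory struct looks roughly like this (a sketch; the handle and enum types are stand-ins for my actual ones):

#include <cstdint>

typedef void* tUniformHandle;	// 8 bytes on x64 Direct3D; a 4-byte GLint on OpenGL

enum eShaderType : uint8_t { VertexShader, FragmentShader };

struct sUniformData
{
	tUniformHandle handle;		// written as 0 in the file; filled in after loading
	eShaderType shaderType;		// 1 byte, followed by 3 bytes of alignment padding
	float values[4];			// 16 bytes of value data
	uint8_t valueCountToSet;	// how many entries of values[] are valid;
								// 3 more padding bytes complete the struct
};
// Because the file layout matches this struct exactly, a whole entry
// can be read (or memcpy'd) directly into an sUniformData instance.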

I stored the data in this order to make the reading process easier.
By storing the effect path at the beginning, I can load the effect before reading the uniforms.
And naturally the number of uniforms should come before the actual uniform data and the uniform names.
Storing the uniform data before the uniform names allows me to get the handles and store them directly into the uniform memory, instead of having to store them somewhere else to avoid them being overwritten.

As you can see in the above screenshots, the binary files of the debug and release versions are not identical.
The differences are in the memory alignment padding, because those bytes are uninitialized memory.
We can fix that by initializing the memory block to 0 via memset before filling in the data and writing it out to the file.
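The fix is a one-liner before filling in the struct (sketch, using the layout above):

#include <cstring>

sUniformData uniformData;
// Zero every byte first so the padding is deterministic,
// then assign the real members before writing the struct to the file.
std::memset( &uniformData, 0, sizeof( uniformData ) );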

Untitled1

And the number of bytes used to store the handle is also different (4 bytes in OpenGL and 8 bytes in Direct3D).

Assignment12-3
And here is a screenshot of the game:
From left to right: a cube with a red opaque material, a cube with a blue opaque material, a cat with a white opaque material, a cube with a green transparent material (opacity 0.5), and a cube with a white transparent material (opacity 0.2)

EAE6320 – Assignment 11 – Write-Up

Download builds
Direct3D
OpenGL
Control:
Escape to close
Cat Movement: Arrow Keys + Q or E to rotate
Camera Movement: WASD -> horizontal movements, LCtrl and Spacebar -> vertical movements

For this assignment, we need to create a Maya exporter to export a Maya model into our human-readable format and load that into our game.
We also need to add the render states into our effect class.

For the MayaMeshExporter project, I set Windows as one of its dependent projects in the build order because it includes “../../Engine/Windows/Includes.h”.
Although I tried building it without setting up the dependency and it still worked fine, I assume it is going to need some of the Windows functionality in future expansions.

Assignment11-1
Once the exporter is built, Maya is able to access it via the environment variable we set up earlier (MAYA_PLUG_IN_PATH); the above screenshot shows my exporter plugin being detected in Maya’s plugin manager.

Assignment11-6
Another nice thing about the exporter is that you can attach Visual Studio to Maya and debug the exporter while it runs.
So you can set breakpoints and step through your exporter when anything goes wrong.

Assignment11-0
So now we can put much more complex objects into our game. (Cat!)

Assignment11-2
This is our human-readable effect file, with the new render states included.
They are 4 booleans specifying whether each state is enabled, which is more understandable than 0s and 1s.

Assignment11-3
And in the actual code, the render states are stored as a uint16_t, with the booleans stored as bits.
Each bit of that uint16_t represents one boolean value, which takes up a lot less storage space and is also easy to access.
In this case, the first bit represents Alpha Transparency, the second bit Depth Testing, the third bit Depth Writing, and the fourth bit Face Culling.
So if you want to enable Alpha Transparency and disable all the other features, the value of the render states will be 0001 in binary, which is 1 as an integer.
And if you enable Alpha Transparency and Face Culling while everything else stays disabled, the value would be 1001, which is 9.
We can check the value easily using bitwise operations.
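A sketch of what those operations look like (the bit assignments match the description above):

#include <cstdint>

enum eRenderState : uint16_t
{
	AlphaTransparency = 1 << 0,	// first bit
	DepthTesting      = 1 << 1,	// second bit
	DepthWriting      = 1 << 2,	// third bit
	FaceCulling       = 1 << 3,	// fourth bit
};

inline bool IsRenderStateEnabled( const uint16_t renderStates, const eRenderState state )
{
	return ( renderStates & state ) != 0;
}

// Alpha Transparency + Face Culling, everything else disabled: binary 1001 == 9
const uint16_t renderStates = AlphaTransparency | FaceCulling;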

Assignment11-5
Assignment11-4
The above 2 screenshots are the binary versions of my effect files.
As you can see, the first 2 bytes are reserved for the render states, while the rest of the file stays the same as in previous assignments.
For the normal opaque effect file, the render states value is 0xE, which is 1110 in binary, meaning every render state except alpha transparency is enabled.
As for the transparent effect, the value is 0xB, which is 1011, meaning everything is enabled except depth writing.

Since the size of the render states is constant (sizeof(uint16_t)), I decided to put them at the beginning of the binary file so I don’t need to worry about getting the wrong offset; the render states could be all zeros, which could be confused with the null characters we use to identify the end of a string.

In this assignment, we also introduced alpha transparency.
In the RGBA system, RGB represents the diffuse colors (Red, Green and Blue) and A represents the alpha value, which is the “transparency”. When alpha is set to 1.0 (or 255, depending on your representation), the object is opaque, i.e. not transparent at all, and when alpha is set to 0.0 (or 0), the object is totally transparent/invisible.
For example, assume the color of a transparent object with alpha value _a (on a 0.0 to 1.0 scale) is color_t and the background color is color_b.
The resultant color would be color_t * _a + color_b * (1.0 – _a).
For that equation to work, we have to know the background color behind the transparent object before drawing it; therefore we have to render transparent objects back-to-front.

Direct3D:
direct3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_SRCALPHA )
direct3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA )
The first line specifies that the source blend factor is the source alpha (color_t * _a).
And the second line specifies that the destination blend factor is the inverse of the source alpha (color_b * (1.0 – _a)).

OpenGL:
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA )
Similarly for OpenGL, the two blending factors are GL_SRC_ALPHA (_a) and GL_ONE_MINUS_SRC_ALPHA (1.0 – _a).

Besides the basic alpha blending mentioned above, there are other kinds of blending, like additive blending, multiplicative blending, etc., that can be used for different needs and situations.

EAE6320 – Assignment 10 – Write-Up

Assignment10-3

Download 3D cube with double-sided floor
Direct3D
OpenGL
Control:
Escape to close
Cube Movement: Arrow Keys + Q or E to rotate
Camera Movement: WASD -> horizontal movements, LCtrl and Spacebar -> vertical movements

In this assignment, we move into 3D space.
Instead of using a 2D vector to specify the location offset of the mesh, we now use 3 different matrices:
LocalToWorld – transforms the coordinates of the mesh from local space to world space in the game
WorldToView – transforms the world coordinates of the mesh into the camera’s local space / view space
ViewToScreen – projects the view onto the 2D screen space
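Conceptually, every vertex position is pushed through the chain in that order (a sketch using column-vector convention; the multiplication order flips if your math library uses row vectors):

position_world  = LocalToWorld * position_local
position_view   = WorldToView  * position_world
position_screen = ViewToScreen * position_view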

Assignment10
Originally, objects were drawn based on their drawing order, so no matter which object was in front in 3D world space, the one drawn later always ended up on top of the others.
To solve this, we introduced the depth test.
What a depth test does is basically assign a “depth value” to each affected pixel of the mesh based on its position, and compare it with the scene’s depth buffer on every render call.
The depth value in the buffer ranges from 0.0f to 1.0f, with 1.0f representing the deepest depth.
On each render call, the depth buffer of the entire scene is reset to 1.0f.
While drawing a mesh, the depth values of the mesh are compared to the corresponding values in the depth buffer;
if a value is lower than or equal to that of the depth buffer, that pixel of the mesh is “in front of” the existing pixel,
so that pixel is drawn and the corresponding value in the depth buffer is updated with the new depth value.
Otherwise, that pixel of the mesh is discarded for this render call.
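Enabling this behaviour is only a couple of calls in each API (sketch):

// Direct3D 9
direct3dDevice->SetRenderState( D3DRS_ZENABLE, D3DZB_TRUE );
direct3dDevice->SetRenderState( D3DRS_ZFUNC, D3DCMP_LESSEQUAL );

// OpenGL
glEnable( GL_DEPTH_TEST );
glDepthFunc( GL_LEQUAL );	// pass when the new depth is lower than or equal to the stored one
glClearDepth( 1.0 );		// the value the depth buffer is cleared to each frame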

Assignment10-1
We also need to update our mesh files to support the 3rd dimension.
Instead of having only 2 values in positions, a 3rd value “z” is added.
Assignment10-2
Since a new float is added to each vertex, the binary file’s structure is updated correspondingly.
The first 8 bytes still represent the vertex count and index count.
Then there is the list of vertex data (4 vertices in this case), where the first 12 bytes of each vertex are the x, y, z coordinates and the next 4 bytes are the RGBA color.
And the index data list stays the same as before.
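The matching vertex struct is now 16 bytes (a sketch; the member names are illustrative):

#include <cstdint>

struct sVertex
{
	float x, y, z;			// 12 bytes of position
	uint8_t r, g, b, a;		// 4 bytes of color
};
// sizeof( sVertex ) == 16, matching the binary layout exactly,
// so the vertex array can be copied straight out of the file buffer.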

In this assignment, I also added a new class Camera.
I put it in my Graphics system since it’s graphics-related.
I put the camera object inside the GraphicContext, which is a struct of graphics-related data that gets passed into the graphics system on almost every graphics-related function call.
The initial position and rotation of the camera are specified in the Graphic::Initialize() call, and then updated in the camera’s own update function.

I also added another floor mesh facing the opposite direction of the original one.
Otherwise, if the player moves the camera below floor level, the floor disappears due to back-face culling, which looks odd to me.

Post IGF Submission

After a couple of weeks of crunching, we finally managed to submit our game before the IGF deadline.
We’ve made tons of changes since the last update.

This is our latest build’s gameplay video.

For this build, we also tried out one of Unity’s new features: Analytics.
With Unity Analytics, we are able to gather more information about our players, which in this case means the IGF judges.
We are able to see when they started the game, which car they picked to play our game, etc.
Those could be really useful information for us.

But there are still some problems with how Unity Analytics presents that information:
1. As long as the information is a number, it will only present a daily total/average, but in some cases we want the number as an index, not a count.
2. There is no detailed event information, e.g. the specific time an event fired, etc.

EAE6320 – ASSIGNMENT 09 – WRITE-UP

Download Rainbow Square + Triangles
Direct3D
OpenGL
Control: Escape to close, Arrow Keys to move Rectangle

For this assignment, we mainly need to make the graphics-related code as platform-independent as possible.
First, we start by making the Render() function in Graphics platform-independent.

Assignment09-1
This is the content of my new Render() after the modification.
As you can see in the screenshot, there is no platform-specific code inside the function.
Although the detailed code differs, the OpenGL and Direct3D Render() functions share a similar structure.
So it is possible to encapsulate the different steps into smaller functions and implement them in platform-specific .cpp files.
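The resulting structure is roughly (a sketch; the helper names are placeholders for the functions implemented per platform):

void Render()
{
	ClearFrame();		// implemented separately in the Direct3D and OpenGL .cpp files
	BeginFrame();
	for ( auto& renderable : s_renderables )
	{
		renderable.Draw();	// bind the effect, then draw the mesh
	}
	EndFrame();
	ShowFrame();		// swap / present the back buffer
}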

The next step is to try our best to make our shaders platform-independent.
Assignment09-2
The screenshot above shows the modified vertex.shader.
Besides the platform-specific input/output variables, the other parts of the shader are merged.

Assignment09-3
In order to unify the platform-specific type names, I included “shaders.inc” to redefine those names for the corresponding platforms.
I decided to keep the platform-specific type names the same in platform-specific code; as for the shared part, I decided to go with the Direct3D style, because floatX makes more sense to me than vecX.
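The relevant part of shaders.inc is just a few platform-guarded defines (a sketch; the platform macro name is an assumption):

#if defined( EAE6320_PLATFORM_GL )
	// Map the Direct3D-style names used in shared shader code onto GLSL's types
	#define float2 vec2
	#define float3 vec3
	#define float4 vec4
	#define float4x4 mat4
#endif
// No mapping is needed on Direct3D, since the shared code already uses the floatX names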

In order to make sure the Asset Builder knows it needs to rebuild the shader assets when “shaders.inc” is changed, even when the shaders themselves are not modified, we need to modify the asset builder to take the dependency into account.

Assignment09-4
To achieve this, I added another optional table in AssetsToBuild.lua named “dependency”; I also have an “optionalDependency” table under each asset, in case more specific dependencies need to be set up in the future.
And in “BuildAssets.lua”, instead of just checking whether the last write time of a target asset is earlier than its source asset’s, I also take the dependency files’ last write times into consideration.

Time taken: 3hrs.

EAE6320 – ASSIGNMENT 08 – WRITE-UP

Download Rainbow Square + Triangles
Direct3D
OpenGL
Control: Escape to close, Arrow Keys to move Rectangle

In this assignment, we need to apply the same changes we made to the mesh files to the effect files.
1. Create a human-readable effect file.
Assignment08-2
I divided the file into vertexShader and fragmentShader, and each of them has a member named path.
There is not much information that needs to be stored in this file, so the setup is pretty straightforward.
We might need to add more variables in this file in the future, but for now this is sufficient.

2. Design a binary effect file.
Assignment08-3
My binary file contains the vertexShader path and the fragmentShader path, both ending with a null character.
By ending both paths with a null character, I can load the first path by casting the fileBuffer to a char*, then advance the cursor by len(firstPath) + 1 and get the second path by casting the new pointer to a char*.
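The loading code then amounts to two pointer casts (sketch):

#include <cstring>

const char* const vertexShaderPath = reinterpret_cast<const char*>( fileBuffer );
// The second string starts right after the first one's null terminator
const char* const fragmentShaderPath = vertexShaderPath + std::strlen( vertexShaderPath ) + 1;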

3. Create an effect builder and import the shader builder
Assignment08-7
For my shader builder, I have a single shader builder and specify whether it is a vertex or fragment shader via optional parameters.
With a unified shader builder, I don’t need to maintain 2 builders in the future, and hence there is less chance of making mistakes.

There are debug and release versions of the built shaders.
But instead of using #ifdef _DEBUG to decide which version to build, we created our own define for this purpose.
This is because when we are debugging the shaders, we may want to keep the rest of the game in an optimized release build but the shaders in the debug version, and vice versa.

Debug Version of Vertex Shader (Direct 3D)
Assignment08-4

Release Version of Vertex Shader (Direct 3D)
Assignment08-3

As shown in the above screenshots, the debug version is a lot larger than the release version because it contains a lot of extra information, like the original shader path, i_color, i_position, etc.
The release version only contains the information necessary for the game to run.

Debug Version of Vertex Shader (OpenGL)
Assignment08-5

Release Version of Vertex Shader (OpenGL)
Assignment08-6

And the difference is even more obvious in OpenGL: in the debug version, the shader file is basically the original shader file with the HLSL part taken out, stored as a binary file.
In the release version, all the comments are removed as well.
Comments are obviously not needed for the program to run, so they are omitted in the release version to reduce the file size and hence the loading cost.

Time taken: 5 hrs.

EAE6320 – ASSIGNMENT 07 – WRITE-UP

Download Rainbow Square + Triangles
Direct3D
OpenGL
Control: Escape to close, Arrow Keys to move Rectangle

Assignment07

In this assignment, we are required to do 2 things:
1. Render 1 rectangle and 2 triangles; the 2 triangles should share the same mesh, which means that if the mesh file of the triangle is changed, the change should be reflected on both triangles.
2. Modify the vertex shaders to take in a variable that modifies the offsets of the drawn meshes.
Although the result could be achieved by simply changing the vertexData of the mesh every frame, that would cause a lot of problems; for example, if we use the same mesh for multiple objects in a scene, using vertexData to specify positions means we need separate vertexData for every instance of the mesh, which is not desirable.
So we introduced another piece of data, called a uniform, in the Effect class, which specifies the offset of the mesh instance and is fed into the vertex shader for drawing.

In this assignment, I created another class called Renderable, which encapsulates the mesh and effect data.
And the graphics project has a list of renderables, which the “game” can modify dynamically.
On each Render() call, the graphics class loops through all the Renderables in the list and draws them accordingly.
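A sketch of the resulting pieces (the names are illustrative; Mesh and Effect are the classes from previous assignments):

struct Renderable
{
	Mesh* mesh;			// shared with every other instance that uses the same mesh
	Effect* effect;
	float offset[2];	// this instance's position, fed to the vertex shader as a uniform
};

// Inside Render(): the mesh data is shared, only the uniform differs per instance
for ( Renderable& renderable : s_renderables )
{
	renderable.effect->Bind();
	renderable.effect->SetOffsetUniform( renderable.offset );	// hypothetical setter
	renderable.mesh->Draw();
}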

In the future, I might need to refactor the code to further encapsulate the Renderable class under another class, like a GameObject that holds most of the general information about an object in the scene.
But for now, I just put the input control and position update inside the Renderable class for simplicity.

While working on this assignment, I got most of the requirements done quite smoothly, except for one.
When I tried to integrate the provided UserInput and Time libraries into my projects, the frame delta time I got fluctuated over time, causing the rectangle to snap on certain frames.
But if I kept moving the mouse over the game window, the problem seemed to go away.

After asking on the mailing list, I finally figured out that it was because of a problem I had planted in my code in previous assignments.

if ( !hasWindowsSentAMessage )
{
// Usually there will be no messages in the queue, and the game can run
// (This example program has nothing to do,
// and so it will just constantly run this while loop using up CPU cycles.
// A real game might have something like the following:
// someGameClass.OnNewFrame();
// or similar, though.)
}
else
{
// If Windows _has_ sent a message, this iteration of the loop will handle it.
// Note that Windows messages will take precedence over our game functionality;
// this is because if we don't handle Windows messages the window can appear sluggish to the user
// (if s/he tries to move it, for example, but we give too much precedence to our own game code).
}

This section of code in the main loop gives Windows messages priority whenever one is raised.
That means the game update functions should be called under if ( !hasWindowsSentAMessage ), but I had placed my Render() call in the else scope, which did not cause a problem in previous assignments, since they were all still images; calling the render function once still yielded the “correct” behaviour.
Therefore I put all the new update and getDeltaTime calls in the same place without much thought.
But since this assignment requires a moving object, placing the code in the “else” scope means the render and getDeltaTime functions are only called when a Windows message is raised, and mouse movement is one such message, hence the weird behaviour in the video.
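The fix is simply to move the per-frame work into the branch that runs when no message is pending (sketch; the update calls are placeholders):

if ( !hasWindowsSentAMessage )
{
	// No message pending: run one frame of the game here
	UpdateDeltaTime();	// placeholder for the Time library call
	Update();			// game logic (input, movement, etc.)
	Render();
}
else
{
	// Windows has sent a message: handle it and nothing else.
	// Per-frame work must NOT live here, or the game only advances
	// when messages (e.g. mouse movement) arrive.
}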

EAE6320 – ASSIGNMENT 06 – WRITE-UP

Download Rainbow Square + Triangle
Direct3D
OpenGL
Control: Escape to close.

In this assignment, we need to create a binary version of our mesh files.
Instead of using our original human-readable Lua files at runtime, we moved the Lua-reading code from the graphics project to the mesh asset builder and use the builder to convert the mesh Lua files into binary files for runtime loading.

hex

The above image shows the binary mesh file in a hex reader.
The first 4 bytes are the number of vertices, and the next 4 bytes are the number of indices.
These are followed by the vertex data, an array of vertex structs taking up 12 bytes each,
and then the index data, an array of indices taking up 4 bytes each.

The number of elements in an array must come before the actual array, since we need to know how many elements to read before actually reading them; there is no other indication of what kind of data, or how much of it, is stored in a binary file.
I put both counts at the beginning of the file to reduce the chance of getting a wrong number due to a wrong offset calculation.

By converting the Lua files into binary files, the file size is reduced, since no labels need to be stored.
It also improves runtime loading time, because no parsing is needed after loading the data from the file;
all we need to do is put the data into the corresponding memory blocks and read it in the correct order.
Another possible advantage is that it increases the difficulty for an ordinary user to “modify” your game after shipping, if you want to prevent that.

Although binary files are more advantageous at runtime, they are relatively harder to understand, so it is still desirable to keep the human-readable files for debugging’s sake.

size-compare
As shown in the image, the human-readable version takes up 374 bytes while the binary version of the same mesh only takes up 80 bytes.

Since Direct3D and OpenGL take vertex order differently and also structure the color channel order differently, the built binary files should reflect those differences on each platform.
The goal of using binary files is to improve runtime efficiency, so we make the binary files follow each platform’s rules and don’t need to do the conversion at runtime.

readdata
This image shows the code used to load the file data.
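The loading boils down to walking a cursor through the buffer (a sketch of the same idea; sVertex here is the 12-byte 2D vertex):

#include <cstdint>

struct sVertex { float x, y; uint8_t r, g, b, a; };	// 12 bytes per vertex

const uint8_t* cursor = reinterpret_cast<const uint8_t*>( fileBuffer );
const uint32_t vertexCount = *reinterpret_cast<const uint32_t*>( cursor );
cursor += sizeof( uint32_t );
const uint32_t indexCount = *reinterpret_cast<const uint32_t*>( cursor );
cursor += sizeof( uint32_t );
const sVertex* const vertexData = reinterpret_cast<const sVertex*>( cursor );
cursor += vertexCount * sizeof( sVertex );
const uint32_t* const indexData = reinterpret_cast<const uint32_t*>( cursor );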

effect1
Another task in this assignment was to encapsulate the shaders in an Effect class to make loading and using them platform-independent, like what we did with the Mesh in previous assignments.
The header of the Effect class is shown in the above image; you can see that both platforms call the same functions for loading and using the shaders.