Final Assignment – The Game

Well, here we are, the final assignment. As part of this final submission, we needed to make a game using the graphics engine that we have created over this semester. The game that I chose to create was a re-creation of Rubic’s Journey, which we created as part of a 2-day game jam.

I have reworked this game to include assets from our thesis project, Maui. The game takes place on a cube, with collectibles to pick up on each face. However, they need to be collected in a certain pattern in order to move forward. The pattern on each face (SPOILERS) actually resembles one of the many operators that we use in C++.


The controls are WASD to move the character around on the face of the cube.

The collectible models were actually created by our artists for our thesis game, and I thought it would be interesting to see whether they could be imported into my graphics engine. As you can see, they worked very well in the game.

While making the game, I realized that although the engine itself was designed to avoid hard-coding wherever possible, my game code was not. We had introduced Lua files for almost everything in our Graphics library, like Meshes and Materials; however, the positions of the objects themselves were hard-coded into the source file. This is one of the things I would change, by creating an Object Lua file containing the mesh and the material to be used for each Object. An example would be as follows:
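A sketch of what such an Object file might look like (the names and layout here are hypothetical, not an actual format from my engine):

```lua
-- Hypothetical Object file: one place to specify everything an object needs
return
{
    mesh = "Meshes/soccerball.lua",
    material = "Materials/standard.lua",
    position = { 0.0, 1.0, 0.0 },
}
```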


Another thing I would change is the asset build system. Whenever I needed to add an object to the game, I had to add three things to my AssetsToBuild.lua file: the Mesh file, the Material file, and the texture used by the Material. If I forgot even one of these assets, my object would not be rendered. I believe this problem can be fixed by driving the asset build system via the Object file that I proposed earlier: the build system would process the Object file and determine which assets need to be built. This, I feel, would make adding objects a lot easier, since developers/designers would only need to modify ONE file.

JP’s class has been very useful in teaching not only the different parts of an ideal graphics engine, but also how to architect such an engine so that it is easily usable both by the developer who designed the system and by other developers/designers. The most interesting part of this class was how it started out with simple assignments where we were more interested in making things look good than in design, but later on we were forced to change our design to make our lives a lot easier. It was interesting to watch ourselves get frustrated when we HAD to change our design, because we realized that the hacks we used at the beginning of the semester simply weren’t going to cut it. We had to create a good design to drive the engine smoothly.

Over this semester I have learned that good design is very important when creating new structures in code. By good design, I mean judging what requirements may arise in the near future, so that we can provide functions to accommodate them. The emphasis on “near future” means that we should NOT prematurely optimize or add functions that MIGHT be required but may end up never being used. This is a decision I have struggled with: deciding whether I might need some functionality in the future. I have always chosen NOT to add something unless I needed it at the time. So far this has not backfired on me, but I think this point of view may change depending on which project I work on.

And so, here is the game that I have created:

Hope you enjoy playing my game as much as I enjoyed making it!!!

Assignment 13 – Textures

This week, I will be adding support for textures on the models we export from Maya and use in our game. The following changes were needed to enable this:

  • Fragment shaders

A new uniform variable called a “sampler” needs to be introduced, which will sample a texture at a given set of “texture coordinates” – which we will obtain from Maya – and render it onto the scene.

  • Texture coordinates

As mentioned before, a set of texture coordinates is needed for the Graphics API to render textures on the screen. For this, the Maya exporter needed to be changed so that the (u,v) texture coordinates could be exported into the mesh Lua files. The Mesh builder also needed to be changed to write these values into the binary format.

  • TextureBuilder

A new builder tool also needed to be added that will process all the texture files provided as part of the Assets folder and convert them into a format easily loaded and rendered by the Graphics API.

  • Material files and Material Builder

The textures now needed to be associated with the Material files, along with the sampler uniform that will sample each texture, so that they can be processed (written into binary form) by the Material builder and loaded at runtime to be rendered in the scene.

Here is the screenshot of one of my Material files with the alpha image as texture:

Binary mat with textures

To understand this format, I have taken a screenshot with a highlight near the texture count – in this case, 1. The texture data starts with a texture count, which lets the Graphics API know how many textures need to be loaded, and each texture entry contains the name of the sampler uniform and the texture path.
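A sketch of how the loading code might walk this layout (the struct and function names here are illustrative, not my engine's actual code):

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Illustrative layout: [uint8_t textureCount][samplerName\0][texturePath\0]...
struct sTextureInfo
{
    std::string samplerName;
    std::string texturePath;
};

std::vector<sTextureInfo> ReadTextureBlock(const char* buffer)
{
    std::vector<sTextureInfo> textures;
    const uint8_t count = static_cast<uint8_t>(*buffer++); // texture count comes first
    for (uint8_t i = 0; i < count; ++i)
    {
        sTextureInfo info;
        info.samplerName = buffer;             // std::string copies up to the '\0'
        buffer += info.samplerName.size() + 1; // step past the terminator
        info.texturePath = buffer;
        buffer += info.texturePath.size() + 1;
        textures.push_back(info);
    }
    return textures;
}
```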

  • Loading and setting textures

This is where a design decision comes in. I had to choose between simply including the member variables required to load textures – a sampler ID and a texture handle – as part of my Material class, OR encapsulating the two in a separate class.

Why I made a separate class for textures

The reason I went with the latter choice – making a separate class for textures – is that there might be a case where multiple textures need to be assigned to the same material, and it is easier to iterate through one list of Texture objects than through two parallel lists of sampler IDs and texture handles. While having multiple textures is not very common – I have seen our artists pack different textures into one PNG file – I still liked the idea of keeping the properties of a Texture separate and hidden from a Material.

I now have a pointer to a list of Textures as a member of my Material class, which gets populated at runtime by the different graphics APIs – D3DXCreateTextureFromFileEx() for Direct3D and glCompressedTexImage2D for OpenGL. I also retrieve the sampler IDs from the fragment shaders, since these IDs are used to bind/set the textures to the shaders during the Draw call.
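A rough sketch of this design decision (member names and types are simplified; the real handles are API-specific):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Each Texture keeps its sampler and handle together, hidden from Material
struct Texture
{
    std::string samplerName;   // the uniform this texture binds to
    int samplerID = -1;        // retrieved from the fragment shader at load time
    void* apiHandle = nullptr; // e.g. an IDirect3DTexture9* or a GLuint wrapper
};

class Material
{
public:
    void AddTexture(const Texture& i_texture) { m_textures.push_back(i_texture); }
    // One list to iterate during the Draw call, instead of two parallel lists
    const std::vector<Texture>& GetTextures() const { return m_textures; }
private:
    std::vector<Texture> m_textures;
};
```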

A comparison screenshot of the OpenGL and Direct3D versions of my scene:


That’s all the assignments we have for this semester… Next up, I will be creating a game with what I have learned and implemented over this semester. Stay tuned for more…

Link to Direct3D executable:

Assignment 12 – Materials

This week, I will be adding material files to my asset list. These files contain the color and alpha values for the whole mesh, which are multiplied with the existing values that are input to the fragment shader. For this, I have added a material Lua format to be used for all material files.

Benefit of using materials instead of just effects:

An effect simply lists the shaders to be used, and the render states that need to be enabled/disabled for the shaders to function properly. A material uses this effect and controls input values such as the color and transparency sent to the fragment shader. Simply put, a material file controls the input values that are sent to the different shaders that it uses. Once an effect is set up, a designer/artist can create a material to try out different looks on different objects quite easily, without worrying about the code working under the hood.

Here is my transparent material:

The effect property specifies the Effect file to use. The uniforms section contains different properties that can be set to different values based on the effect required. The values field can be a single float or a float array, depending on the property being set.
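A sketch of a material file following this format (the uniform names here are made up for illustration):

```lua
return
{
    effect = "Effects/transparent.lua",
    uniforms =
    {
        { name = "g_color_perMaterial", values = { 1.0, 0.5, 0.5 } }, -- float array
        { name = "g_alpha_modifier",    values = 0.5 },               -- single float
    },
}
```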

As always, this Lua file needs to be converted to binary format to be loaded at runtime. The above material file will look as shown below in binary:


The first string represents the effect file path, followed by the count of uniforms that I need to look for. Once I get the count, I allocate enough data for the objects and start reading in the data. Each entry consists of the uniform name, followed by the data belonging to that uniform.

In retrospect, interleaving each uniform name with its data sounded right conceptually, since a name and its data belong together. (If you instead had 100 uniform data entries in sequence followed by 100 uniform names, it would be awkward to track by jumping back and forth.) However, with the interleaved layout, doing a separate memcpy for each entry would not be a good idea, for the same high-count reason. I will need to revisit this another time.

All this data gets read into a structure as shown below:


Once the binary files are read in, I set these uniforms individually during each draw call, since at this point I have access to all the uniform handles I need to set the data.

Each source material creates identical versions irrespective of the debug and release configurations; however, it creates different-sized files for Direct3D and OpenGL, since the handle sizes differ between these platforms. The data, however, is the same, since I am setting the handle to NULL in both cases.

Here is how my game looks now!


Link to the game:


Stay tuned for next week (TEXTURES!!!!!)…

Assignment 11 – Maya Mesh Exporter

This week, I will be building a Maya Mesh Exporter, which will export meshes created in Maya into the Lua format that I use. We were given a project which uses the Maya libraries to extract the data provided for each exported mesh, so that I can translate this data into my specific Lua format.

What the MayaMeshExporter project’s dependencies and dependents are:

The MayaMeshExporter project itself doesn’t depend on any projects in the solution, and none of the projects in the solution depend on it directly. However, without the exporter, the MeshBuilder would be pretty useless, since there won’t be any meshes to build. But I don’t think the MeshBuilder can be listed as a dependent, since the exporter is not something the builder uses directly.

Here is a screenshot of my plugin loaded into Maya 2016:

Maya plugin mgr

When the plugin is used to export meshes, it provides a ton of information about the mesh, including the position and color of the vertices, as well as the index array indicating how these vertices should be used to render. This is all we need to render meshes via our C++ code, so this data will be written into the mesh Lua file inside our Assets folder.

Here is the screenshot where I debug the Maya plugin in action:

Maya debugging

As can be seen from the screenshot, each element in the vertex buffer contains a lot more information than just the position: Maya provides normals, tangents, bitangents, and UVs. At this point we don’t use any of this extra information, so I decided not to write these values into my mesh Lua file. If a need for this information ever arises, I will have to modify the Mesh exporter to write it into the Lua file AND the MeshBuilder to read in the extra data. I might also need to change the sVertex struct in my Graphics library to accommodate the extra information, and add/modify any APIs that use it.
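For reference, a sketch of how sVertex might grow if that extra data were ever needed (this is hypothetical, not my engine's current struct):

```cpp
#include <cassert>
#include <cstdint>

struct sVertex
{
    // currently exported data
    float x, y, z;        // position
    uint8_t r, g, b, a;   // color
    // potential additions if the extra Maya data were ever used:
    // float nx, ny, nz;  // normal
    // float u, v;        // texture coordinates
    // float tx, ty, tz;  // tangent
    // float bx, by, bz;  // bitangent
};
```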

Here is the screenshot of my scene:


The scene contains four meshes:

  • A colored soccer ball.
  • A floor where the ball rests.
  • A transparent pipe mesh.
  • A dodecahedron.

Each of these meshes has been created in Maya and exported into the human-readable Lua format to be used in the game.

To get the transparency effect, I have written a new fragment shader which controls the transparency of the mesh. This shader is essentially the same as the fragment shader used before, with one additional line of code that sets the alpha value of the color, a float ranging from 0 to 1, with 0 being completely transparent and 1 being fully opaque.

An “alpha” value indicates whether a vertex will be opaque, completely transparent, or something in between. These values are generally used to create effects seen with mirrors (not the reflective kind, obviously – think tinted glass), where objects behind can be seen with a semi-transparent effect.

The reason we render from back to front is so that this effect shows up properly. When objects are rendered from back to front, the objects (or parts of them, like the soccer ball in the screenshot) behind transparent objects can be seen through them.

An additional process used to create this effect is “alpha blending”. This essentially combines two colors, a foreground and a background color. The process takes the transparency or “alpha” value of the foreground color and computes the “blended” color as a weighted average of the two. So if the foreground color is completely transparent, the blended color is the background color, and if it is opaque, the blended color is the foreground color.
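Per color channel, the standard blend described above can be sketched as follows (with alpha = 1 meaning opaque and alpha = 0 meaning fully transparent):

```cpp
#include <cassert>

// blended = alpha * foreground + (1 - alpha) * background
float BlendChannel(const float foreground, const float background, const float alpha)
{
    return alpha * foreground + (1.0f - alpha) * background;
}
```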

Now, to enable all these effects, we need to make some changes to the C++ source, where we need a structure called “render states”. A render state belongs to an Effect and determines how objects using that Effect should be rendered. For now, this render state will be a combination of three separate values:

  • Alpha transparency – boolean indicating whether to enable transparency or not.
  • Depth testing – boolean indicating whether depth testing should be enabled.
  • Depth writing – boolean indicating whether the depth buffer should change on each draw call.

Since each of these values is a boolean, we could have three separate boolean member variables inside our Effect. However, there are some disadvantages:

  • Each boolean takes up 1 byte, which is 7 bits more than what we need to represent a true or false value.
  • Maintaining and adding more values is an issue, as each new render state increases the size by 1 byte, not to mention the readability problem. Direct3D alone defines more than 100 render states. Imagine how it would look if we added 100 member variables for all the render states we use!

In order to avoid these problems, we will be using an integer where each bit represents one of the render states. Right now, since we have only three values to worry about, we will be using the uint8_t type, which is an 8-bit integer, with the rightmost three bits indicating the three values. Reading those three bits from the highest to the lowest, I have chosen the following order:

Depth-Writing, Depth-Testing, Alpha-Transparency

and have defined each of these bits in the Effect header as shown below:

render state bits

This means an integer value of 3 (or 011 in binary) indicates that we should enable alpha transparency and depth testing, but not depth writing. (This is actually a realistic scenario: transparent objects are commonly drawn with depth testing enabled but depth writing disabled.)
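The bit definitions might look something like this (my reconstruction, with alpha transparency in the lowest bit as implied by the 6-versus-7 example discussed below; the actual names in the Effect header may differ):

```cpp
#include <cassert>
#include <cstdint>

// One bit per render state, packed into a single uint8_t
const uint8_t RENDERSTATE_ALPHA_TRANSPARENCY = 1 << 0;
const uint8_t RENDERSTATE_DEPTH_TESTING      = 1 << 1;
const uint8_t RENDERSTATE_DEPTH_WRITING      = 1 << 2;

inline bool IsRenderStateEnabled(const uint8_t i_renderStates, const uint8_t i_bit)
{
    return (i_renderStates & i_bit) != 0;
}
```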

Now these values have to come from somewhere, and so I have added a “renderStates” section in my Effect lua file as shown below:

renderstates lua

The reason I created a separate section is to draw a clear distinction between the paths to the shaders and the render states that each Effect uses. However, neither this section nor any of the values inside it is mandatory in my format: if the section or any value is absent, the EffectBuilder will simply use the default values (false for alpha transparency, true for depth testing and writing) for those states.

The binary format of both my Effect files are shown below:

Standard Effect binary

Transparent Effect binary

Highlighted in yellow is the byte indicating the render state values. In the standard effect, the render state is set as 6, which in binary is 110. Following the format mentioned above, we can see the alpha transparency bit is set to false or 0. However, in the transparent effect, the render state is set to 7 or 111 in binary. Here the alpha transparency is set to true or 1.

I added the render state value at the end of the file because I read the vertex and fragment paths in sequence, and appending it to the end let me keep that code unchanged AND add just two lines to get the render state value. To get the render state, all I had to do was advance the buffer pointer to that position and copy the data into my render state member variable.

Link to my game:


That’s it for now, stay tuned for next week’s update…


Assignment 10 – 3D Rendering and Camera

This week I will be adding 3D rendering(Yay!) and Camera objects to my source.

The easier part of this assignment is to properly read and write the z coordinate from the mesh file and have it render in the game window. The changes I had to make:

  • Changing the vertex shader to accommodate 3D vertices (all float2/vec2 variables became float3/vec3).
  • Adding an additional z member to the sVertex struct.
  • Changing the Initialize call to indicate to the GPU that vertices now have 3 coordinates.
  • Adding another value to the position property in the mesh file.
  • Changing the mesh builder to read in the extra z value.

In addition to this, the shaders also needed transform matrices.

Transform matrices and their roles

Each position is always represented in “local space”, meaning that it is relative to the center of the mesh. However, the GPU needs to know the screen position equivalent of these vertex positions in order to know where to draw them on the screen.

There are three transform matrices that are required to properly render these vertices.

  • Local-To-World : This matrix will transform the local position into the world coordinate system, or the “World Space”.
  • World-To-View : This matrix will transform the world space positions into something called the View Space(or the Camera Space) which is the viewing area of a camera.
  • View-To-Screen : This matrix will transform the view space coordinates to the screen space, which, in the end, is what the GPU needs to correctly draw the vertices on the game window.

To add these matrices, both the vertex shader and the source linking the shader need to be modified. For the shaders, three uniforms need to be added to the vertex shader, namely the matrices mentioned above. The transformations are performed in the shader so that the screen-space equivalent of the local coordinates can be sent to the GPU for rendering.

However, these matrices will be set from the C++ code, in a similar fashion that I used to set the position offset in an earlier assignment.
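The chain of transforms can be sketched in code as follows (a toy row-major matrix using the column-vector convention, not my engine's actual Math library):

```cpp
#include <array>
#include <cassert>

struct Mat4
{
    std::array<float, 16> m{}; // row-major storage, zero-initialized

    static Mat4 Identity()
    {
        Mat4 r;
        for (int i = 0; i < 4; ++i)
            r.m[i * 4 + i] = 1.0f;
        return r;
    }
};

Mat4 operator*(const Mat4& a, const Mat4& b)
{
    Mat4 r;
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
        {
            float sum = 0.0f;
            for (int k = 0; k < 4; ++k)
                sum += a.m[row * 4 + k] * b.m[k * 4 + col];
            r.m[row * 4 + col] = sum;
        }
    return r;
}

// The vertex shader effectively computes
// screenPos = viewToScreen * (worldToView * (localToWorld * localPos)),
// so the combined matrix is the product in that order:
Mat4 LocalToScreen(const Mat4& localToWorld, const Mat4& worldToView, const Mat4& viewToScreen)
{
    return viewToScreen * worldToView * localToWorld;
}
```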

Now that this is all set up, it’s time to add a 3D mesh to the asset list – basically a box. The first four vertices will be the same as in the square mesh file used before, with an additional coordinate of Z=1. The remaining four vertices will have the same X and Y values, but with Z=-1. This is the easier part.

The tricky part is the list of indices. While the front face triangles will be the same as before, the back face triangles will be using the opposite winding order in order to properly render. Same goes for each pair of faces opposite to each other. In order to figure out the index order, I had to manually establish the winding order of one face, and then reverse it for the opposite face.

In addition to the changes in the Lua file, the structure of the object(the Renderable in my case) also had to change to accommodate a position and an orientation(instead of a position offset as before). These two properties are essential in creating the first transform(Local-to-World).

Once the mesh was created, I added a Floor mesh for the cube to rest on. Here is the Lua file and the binary version:

Floor lua / Floor binary

To make it easier to read, I have changed the columns in my hex editor to show four bytes per column. To check the z coordinate, look at the last column, reverse the byte order (since the value is stored in little-endian format), and convert it from hex to float.
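That hex-editor exercise can be mirrored in code by reinterpreting the four little-endian bytes as a float (on x86, which is itself little-endian, no byte swap is needed):

```cpp
#include <cassert>
#include <cstring>

float FloatFromBytes(const unsigned char i_bytes[4])
{
    float value;
    // memcpy avoids the aliasing problems of a raw pointer cast
    std::memcpy(&value, i_bytes, sizeof(value));
    return value;
}
```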

However, once I added this, my floor started to render on top – or more precisely, in front – of the cube. To fix this, I needed to enable the depth buffer and depth testing.

Depth buffer and depth testing

The depth buffer and depth testing allow us to draw objects in any order, without worrying about what gets rendered first or on top of what. A depth buffer contains the z depth of each pixel rendered. If two rendered objects overlap, depth testing is performed to decide which pixel is closer to the camera.

There are various comparison functions that can be used to make this decision as listed here: CompareFunctions

We, however, will use the LessEqual function to make this decision. This is because a pixel at the near plane has depth 0 and one at the far plane has depth 1, and we want the pixels closer to the near plane to win. By using the LessEqual comparison, we guarantee that objects closer to the camera will be rendered and the ones behind will be obscured.
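Conceptually, the per-pixel decision is simple; a sketch (with 0 at the near plane and 1 at the far plane):

```cpp
#include <cassert>

// A fragment "wins" if it is at least as close to the camera as what is
// already stored in the depth buffer for that pixel
bool PassesLessEqualDepthTest(const float i_fragmentDepth, const float i_bufferDepth)
{
    return i_fragmentDepth <= i_bufferDepth;
}
```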

This is how the game looks after enabling depth buffer and depth testing:

Depth buffe

At this point, the position and orientation of the camera were hardcoded. However, as we know, in most games the camera control is given to the player. So I created a Camera class inside my Graphics library, with a position and an orientation property that can be modified by user controls. This camera object is now passed into the method which sets all three transforms in the vertex shader, since the camera is needed to create the World-to-View transform.

A Camera object will now be present inside the main Graphics source. I feel this is the right place, since the Camera – in my opinion – is an integral part of the Graphics class, where it is used to render objects in the correct view space by setting the transforms correctly. However, I have given the Game code access via a getter so that this object can be controlled by user input.

Here is the screenshot of the game this week:


That’s it for now, stay tuned for next week…

Assignment 09 – Platform Independence

In addition to the ShaderBuilder assignment, I will also be modifying the Graphics source file to be completely platform independent. This was a much-awaited task for me, since I had been focusing so much on making platform-independent interfaces for the Mesh and Effect classes that the Graphics source kind of took a backseat.

Now for the details of the Render function. There are three parts to it (I think I got this right):

  • Clearing Buffers
  • Rendering Meshes
  • Presenting the Rendered meshes

The “Rendering Meshes” part is already platform independent since it uses the platform independent interfaces created as part of an earlier assignment.

The other two parts now need to use platform-independent interfaces, so I created a Context namespace with platform-independent interfaces for ClearBuffers and SwapBuffers. Of course, the implementations have to be platform specific, so I added two platform-specific files for Context which contain the definitions of these interfaces.

Technically, this was the first half of the assignment, but I couldn’t let go after just making the Render function platform independent and leaving the rest as is. So I proceeded to make the whole Graphics source platform-independent, which meant the Initialize and Shutdown functions too. Not that hard, considering I only had to move code from Graphics into Context.

It was a great feeling that by doing this, I reduced my source file to a mere 100 lines, compared to the 300 lines it was before. Of course, this code has moved into different files, but those files can now be modified by a (hypothetical) D3D or OpenGL expert, and the Graphics source doesn’t have to worry about it.

Here is my platform independent Render function:

Render PI

The second part of this assignment was to make the shaders as platform independent as possible. This was a little tricky for me, as I am not very familiar with shader languages, specifically their syntax. But it turns out they follow a C-style structure, allowing #ifdef macro definitions within the shader file, and they also have provisions for pulling in dependent files via include statements.

An include file was already provided to us, which essentially “typedef’d” all the data types from GLSL to HLSL and vice versa, so I proceeded to use it in both my shaders. However, the shader code itself also had to become platform independent, and for that I had to choose one primary format for my shaders: HLSL or GLSL.

I decided to use HLSL as the primary syntax for a purely personal reason: I have always preferred Microsoft over open-source languages. This is the same reason I feel comfortable using Visual Studio even when using Unity or any other engine.

Here is the platform-independent vertex shader (mostly – I don’t think the shader can be made any more platform independent):

VertexShader PI

P.S. This screenshot is without comments; my actual files do have them.

As you can see from the screenshot, I have also managed to make the main function platform independent by having its parameter list apply only in Direct3D; since OpenGL doesn’t need those parameters, the list expands to nothing there.

The output position has also been “typedef’d” inside the include file, as shown below:

shader include

Now, because the include file has been added, this file has to be treated as a dependency for the shaders. Meaning that these shaders should be rebuilt if anything changes in the include file.

The first change I made to accommodate this was to add a key in the AssetsToBuild.lua file which will specify all dependencies for an asset type. This can be seen below:

shader dep

The BuildAssets.lua file also needed to change to check for any change in the dependencies, by taking into account the write times of the source and target files. This logic was already present in the Lua file, so I simply extended the function to check the dependencies as well as the asset itself when deciding whether it needed to be built.

Once this was taken care of, the builders would automatically get called if either the asset or its dependency was out of date. This system is very similar to what I think Visual Studio does when I choose to “Build” rather than “Rebuild”. It was interesting to set this system up in Lua, so I really enjoyed coding and debugging this.

And that was it, the shaders and the Graphics source were now platform independent. All in all, a good week’s work… 🙂

Here is the link to the assignment:


Next stop, 3D!!!!!


Assignment 08 – Binary Effect and New Builders

This week, I will be making the shader creation at run-time more data-driven by creating a Lua file listing the paths to the shaders. For simplicity’s sake, I have just two named values in the Lua file, as shown below:

Effect lua

As mentioned before, I chose this format to keep it simple and human-readable. The null at the end of each string was meant to terminate it, so that I could read the null character along with the string. That assumption turned out to be wrong: the only call Lua has for extracting string data is lua_tostring, which reads the whole string but does NOT count the null character as part of it. So I had to manually write the null character after writing the string.
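The fix amounts to writing one extra byte per string; something along these lines (a sketch, since the actual builder code may differ):

```cpp
#include <cassert>
#include <ostream>
#include <sstream>
#include <string>

// lua_tostring() gives the characters without counting the terminator,
// so the builder writes size() + 1 bytes to include the '\0' explicitly
void WriteNullTerminatedString(std::ostream& io_out, const std::string& i_string)
{
    io_out.write(i_string.c_str(), static_cast<std::streamsize>(i_string.size() + 1));
}
```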

The Effect binary file:

Effect binary

Since all the characters in the paths are ASCII, the file is pretty straightforward to read in the hex editor. The binary file contains both shader paths, separated by a NULL character that the hex editor shows as a ‘.’ (dot) character.

Once the binary file is created during build-time, these two paths need to be extracted from this file, instead of sending hard coded paths to the effect creation method.

Binary Effect extraction

I changed the declaration of CreateEffect to accept only one parameter (instead of the two path parameters I had before): the path to the effect binary file. As you can see from the code, I read in the whole buffer by giving the whole length of the file. However, this buffer will “show” (while debugging) only the first path, since it was NULL-terminated when written into the file. I get the length of this path and set the seek point in the file to just AFTER it so that I can read the second path. I now realize I could probably have avoided the second read by moving the buffer pointer past the first path and doing a memcpy. But what’s done is done.
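The pointer-arithmetic alternative might look like this (illustrative names; std::string's const char* assignment stops at the '\0', so not even an explicit memcpy is needed):

```cpp
#include <cassert>
#include <string>

struct sEffectPaths
{
    std::string vertexShaderPath;
    std::string fragmentShaderPath;
};

sEffectPaths ReadEffectPaths(const char* i_buffer)
{
    sEffectPaths paths;
    paths.vertexShaderPath = i_buffer;             // copies up to the first '\0'
    i_buffer += paths.vertexShaderPath.size() + 1; // step past the terminator
    paths.fragmentShaderPath = i_buffer;           // the second path follows
    return paths;
}
```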

I feel that this has only reduced the hardcoding by one parameter. A better solution would be to have all paths in one Lua file, store them somewhere during the initial start-up process, and reference them as needed.

The next task was to make a ShaderBuilder that would essentially pre-compile the text shader files into a binary format that can be easily and quickly loaded into the graphics device when creating the individual shaders. This project was already given to us, and I simply had to modify the AssetsToBuild.lua file to use this tool to build the shaders. However, there was a slight difference in implementation: Direct3D needs to know which type of shader is being built. I had to choose between having two different shader builders or using a single one with optional arguments.

I felt that using a single ShaderBuilder was the better choice, because I cringed at the thought of duplicating code and maintaining two separate projects. So I decided to pass in optional arguments via my Lua file, as shown below:

shader optional

The ShaderBuilder tool also has different debug and release configurations to build different versions of the shaders. This is controlled by a C++ macro – EAE6320_GRAPHICS_AREDEBUGSHADERSENABLED.

The reason we use a separate macro instead of just _DEBUG – the built-in VS macro indicating debug builds – is to clearly isolate certain “types” of code. By this I mean that whenever I want to section off shader-specific code, I will and SHOULD use this macro, and not the general one, since these are not general tasks.


Debug DirectX Shader


Release DirectX Shader

From the screenshots, it is obvious that the Debug version of the vertex shader is bigger. This is because the command line contains a switch, /Od, which disables optimization, so the compiler will not optimize the code while building the shaders. Another reason is that when building in Debug mode, some compilers add debug information to the output to enable easier debugging; Visual Studio does this when it creates PDB files, which contain debugging information for projects.

The same goes for OpenGL shaders as shown below:


Debug OpenGL vertex shader



Debug OpenGL fragment shader

As you can see from the binary versions, pretty much the whole source shows up in the hex editor. The only difference between the debug and release versions is the comments section. The reason to include the comments is simply to make it easier to read and debug the shader file after it is built. This is not required in the release version, because all the debugging that needs to be done will already have been done in the Debug build.

Here is the link to the build:


That’s it for this assignment, stay tuned for more….

Assignment 07 – Position offset for meshes

This week, I will be adding a new variable called the position offset, which will be used to position different meshes in the game instead of baking the final positions into the Lua mesh files.

For this, the first thing to be done is to add a corresponding variable in the vertex shader, so that the vertex shader can take it into account while drawing the vertices in the game window. This variable will need to be modified – or more precisely, set – on each draw call made from the C++ source, which means the C++ code needs access to it.

I got access to this variable by getting a list of all global or “uniform” variables (as they are called in shader terminology) and then querying this table for the specific variable name. Once I have access, I can set this variable to whatever offset I need it to be.
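On the OpenGL side, for example, that lookup-then-set boils down to calls like these (a sketch, not my exact code; the uniform name and the s_ variables are illustrative):

```cpp
// Query the uniform table once, after the program has been linked:
GLint location = glGetUniformLocation( s_programId, "g_position_offset" );

// Then, on each draw call, set the offset before drawing the mesh:
const GLfloat offset[2] = { s_positionOffset.x, s_positionOffset.y };
glUniform2fv( location, 1, offset );
```

The Direct3D 9 version does the analogous query by name through the shader's constant table.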

Now the objective of this assignment was to make a rectangle mesh controllable with user input. So the position offset will change with each user input and needs to be set before the mesh can be rendered. With these changes that needed to be done, I added a new class named Renderable which encapsulates the following three things:

  • a Mesh to be rendered
  • an Effect to be used while rendering
  • a position offset

This class also contains a Render function whose implementation is platform-independent, where I perform the same sequence of steps I did in the Graphics Render function:

  • Set Shaders
  • Set Position Offset
  • Draw Mesh
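Put together, the class looks something like this (a sketch with stub Mesh/Effect types; the real ones wrap platform-specific graphics objects):

```cpp
#include <cassert>

// Stand-in types for this sketch; the real ones hold
// platform-specific buffers and shader objects.
struct sVector2 { float x, y; };
struct Mesh
{
	void Draw() const { /* platform-specific draw call */ }
};
struct Effect
{
	void Set() const { /* bind the vertex and fragment shaders */ }
	void SetPositionOffset( const sVector2& ) const { /* set the uniform */ }
};

class Renderable
{
public:
	Mesh* m_mesh = nullptr;
	Effect* m_effect = nullptr;
	sVector2 m_positionOffset = { 0.0f, 0.0f };

	// Platform-independent render sequence:
	void Render() const
	{
		m_effect->Set();                                // set shaders
		m_effect->SetPositionOffset( m_positionOffset );// set position offset
		m_mesh->Draw();                                 // draw mesh
	}
};
```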


Now, the changes to the Graphics source: since the three main parts that we need to render a shape(Mesh, Effect and a position offset) are encapsulated within Renderable, I can remove the references to the Meshes and Effects I had before, and instead maintain a list of Renderables inside Graphics.


I maintain two lists of renderables: the first is a user-controlled list which I provide to the Game “world” to be moved by user input; the second contains all the static objects that will be rendered in the scene.

The reason for keeping the lists in Graphics instead of outside is that, right now, everything that is part of the object is graphics-related. If a case ever comes up where the parts of the game object that the game controls/modifies differ from what the Graphics library needs, I will either move the lists to the Game application OR simply split the object, keeping only the section the Graphics library needs to do its thing.

With the introduction of the offset, we no longer need to worry about setting the final position in the Lua file; the meshes can be centered around the window center – (0, 0) – and the offset will take care of rendering them where we want them.

Here is the screenshot of the game this week:


The rectangle in the center can be controlled with the arrow keys.

Link to the assignment:



Assignment 06 – Binary meshes and Effects

This week, I will be creating a binary mesh format that will be created from the Lua file during the build process. For this, I will be moving the Lua-loading code from the Graphics library to the AssetBuilder system, specifically the MeshBuilder application. This changed some things in the solution:

  • Dependency on the Lua library moved from Graphics to the Mesh Builder project.
  • Changing run-time error messages in the code to build-time error messages
  • Making the sVertex struct accessible.

The last point led to a design decision. This struct was in the private section of my Mesh class, and the easiest way to make it accessible was to move it to the public section, but that also makes it accessible to any other class that includes this header. While I am the only one who will be editing this code, I wish there were some way in C++ to expose only certain parts of a class instead of revealing the whole public section (it would make the abstraction a lot more “secure”?). For example, if a class needs only sVertex, the other public members should not be revealed. To my knowledge, this is not possible (maybe in other OO languages?), so I made the choice to make this struct public.

Once the Lua loading code was moved, the data that I got from Lua had to be converted to bytes that the Graphics library will read at run-time. The code that I chose for this looks like this:

// Write vertexCount
char *vertexCount = reinterpret_cast<char *>( &verticesCount );
meshBinary.write( vertexCount, sizeof( verticesCount ) );

This code essentially creates a char pointer (a pointer to a 1-byte type) and points it at the starting address of verticesCount. The count itself is a 4-byte value, which means that if I tell ofstream to write from this pointer with the size of the count, it will write the 4 bytes starting at that address. This approach is followed for the remaining data (vertices, index count and indices) as well.
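Extending that pattern to all four pieces of data, the write step could look like this (a self-contained sketch: the vertex layout and names are illustrative, and a generic std::ostream stands in for my actual file stream):

```cpp
#include <cassert>
#include <cstdint>
#include <ostream>
#include <sstream>

// Illustrative 12-byte vertex layout (two floats plus a 4-byte color)
struct sVertex { float x, y; uint32_t color; };

// Write count-then-data for vertices and then indices, in the same
// order the buffers are created in code.
void WriteMeshBinary( std::ostream& io_stream,
	const sVertex* i_vertices, uint32_t i_vertexCount,
	const uint32_t* i_indices, uint32_t i_indexCount )
{
	io_stream.write( reinterpret_cast<const char*>( &i_vertexCount ), sizeof( i_vertexCount ) );
	io_stream.write( reinterpret_cast<const char*>( i_vertices ), sizeof( sVertex ) * i_vertexCount );
	io_stream.write( reinterpret_cast<const char*>( &i_indexCount ), sizeof( i_indexCount ) );
	io_stream.write( reinterpret_cast<const char*>( i_indices ), sizeof( uint32_t ) * i_indexCount );
}
```

With 4 vertices and 6 indices this produces 4 + 48 + 4 + 24 = 80 bytes, the same size as my built mesh file.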

My Binary Mesh format:

Shown above is my binary mesh format, where the order of data insertion is: vertex count, list of vertices, indices count and the list of indices. The reason I chose this order is because this is the order I maintain and create my vertex and index buffers in code.

Why must the number of elements in an array come before the actual array?

The reason for this is that when I read the data from this mesh, I need to know, say, how many vertices to read – or more precisely, how many bytes the list of vertices occupies in this file. The count tells me exactly how many bytes, by multiplying it by the size of the sVertex struct (12 bytes in total). In this format, you can see the first four bytes tell me there are 4 vertices (04 00 00 00 => 4), so I read the next 48 bytes as my list of vertices. The same goes for the indices.

Advantages of binary format:

  • Faster loading of data during runtime. Parsing human-readable text into floats or 32-bit integers takes time. Maybe not a lot, but we will have many more vertices to deal with once we start reading data supplied by the Maya exporter we will eventually write this semester, and the conversion time will become more significant.
  • The size of the binary file is much smaller than a human-readable format. This is particularly useful when shipping games, as game data has to fit in a limited amount of space on DVDs, and a binary format keeps it small. This also applies to Steam games, allowing the user to download games faster and have them occupy less space (yay for me personally, since I can fit more games on my system).
  • This advantage might not be performance- or memory-related, but there is also no reason to make these files human-readable, since a player is almost never going to look at them, much less modify them. Although I have seen cases where some config files ship in text formats, I personally never modify them unless I have to (for fixing glitches or bugs in games instead of waiting for patches).

Binary formats at run-time VS human-readable formats at build time

This should be clear from the third point I mentioned before, but keeping human-readable formats only makes sense when there are potential “humans” who will read and, more importantly, modify them. That is why we keep the Lua files human-readable: so that they can be clearly understood and easily modified by developers (only me at this point) and designers. However, I wonder if we will ever devise a way for changes to these Lua files to dynamically change the output in our game (the way Unity lets you move objects even during play mode).

As for the size advantage, my Mesh.lua file is 359 bytes whereas my binary mesh file is only 80 bytes (less than a quarter of the size, a significant difference).

Whether the built binary mesh files should be the same or different for the different platforms, and why

I believe that for this assignment the binary mesh files can be the same, since both Direct3D and OpenGL accept int values for the buffer sizes (well, Direct3D accepts unsigned int and OpenGL accepts int, but both are 4 bytes on Windows).

However, a warning from Microsoft – Data Type Ranges

The int and unsigned int types have a size of four bytes. However, portable code should not depend on the size of int because the language standard allows this to be implementation-specific.

This means that different architectures might consider int as occupying a different size (on consoles, maybe?), so care should be taken with the number of bytes read from or written to these binary files, since you may end up reading/writing more or less than the intended data.
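One way to guard against this (my own suggestion, not something the assignment requires) is to use explicitly-sized types for everything written to the binary file, so the on-disk layout can't drift between platforms:

```cpp
#include <cassert>
#include <cstdint>

// Fail at compile time if the assumption ever breaks
static_assert( sizeof( uint32_t ) == 4, "counts in the binary mesh must occupy exactly 4 bytes" );

// The counts would then be declared like this instead of as plain int:
uint32_t verticesCount = 4;
uint32_t indicesCount = 6;
```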

Extracting the four pieces of data from the binary file when loaded at run-time

During run-time, I need to read in the binary data to populate the four pieces of information I need to render a mesh: vertex count, list of vertices, indices count and the list of indices.

I do this using the following code snippet:
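In essence, the extraction works like this (a sketch of the idea, not my exact code; names are illustrative and the vertex layout is the 12-byte one from before):

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

struct sVertex { float x, y; uint32_t color; };  // 12-byte vertex

// The buffer holds the whole file; each piece of data is found by
// offsetting past the pieces that came before it.
void ExtractMeshData( char* i_buffer,
	uint32_t& o_vertexCount, sVertex*& o_vertices,
	uint32_t& o_indexCount, uint32_t*& o_indices )
{
	// First 4 bytes: reinterpret the char* as a uint32_t* and dereference
	o_vertexCount = *reinterpret_cast<uint32_t*>( i_buffer );
	// The vertices start right after the count
	o_vertices = reinterpret_cast<sVertex*>( i_buffer + sizeof( uint32_t ) );
	// Offset past the count and the whole vertex list to reach the index count
	char* indexCountStart = i_buffer + sizeof( uint32_t ) + ( sizeof( sVertex ) * o_vertexCount );
	o_indexCount = *reinterpret_cast<uint32_t*>( indexCountStart );
	// And the indices follow their count
	o_indices = reinterpret_cast<uint32_t*>( indexCountStart + sizeof( uint32_t ) );
}
```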


The first read essentially converts the char pointer (1 byte) to an int pointer (4 bytes), meaning that when I dereference it, I get the data occupying the first four bytes of the buffer, which is the vertex count. The next set of bytes gives me the list of vertices, starting right after the vertex count (or more accurately, after the 4 bytes the count occupies). Now, if the list of vertices had come before the vertex count, I would not have known where to read the count from, since I wouldn't know the size of the list. The same goes for the indices: since I know the number of vertices, I can easily offset past the bytes occupied by the vertex count and the list of vertices to get to the index data.

Once I obtain these four pieces of data, I do the same as before; create vertex and index buffers to be used by the Graphics library for rendering.

The second (and easier) part was to encapsulate the vertex and fragment shaders into an “Effect” class. I essentially had to add two platform-independent interface functions accepting the paths of the vertex and fragment shaders to create the platform-specific API objects, namely the IDirect3DVertexShader9 and IDirect3DPixelShader9 objects for Direct3D and the GLuint program ID for OpenGL. I made these objects private members of my Effect class and added two public platform-independent functions: CreateEffect, for creating and loading the shaders, and SetEffect, for binding the shaders to the device (or program for OpenGL). In addition, I also have a platform-independent destructor, which cleans up the platform-specific graphics objects once we are done with them.

My Effect class:


(apologies if it is small; clicking on it should make it look clearer)
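In textual form, the interface boils down to something like this (a hedged sketch with stubbed bodies; the real implementations are platform-specific and the member shown is only a stand-in):

```cpp
#include <cassert>

class Effect
{
public:
	// Load and create the platform-specific shader objects
	// (IDirect3DVertexShader9/IDirect3DPixelShader9, or a GLuint program)
	bool CreateEffect( const char* i_vertexShaderPath, const char* i_fragmentShaderPath )
	{
		// platform-specific creation omitted in this sketch
		return i_vertexShaderPath && i_fragmentShaderPath;
	}
	// Bind the shaders to the device (Direct3D) or program (OpenGL)
	void SetEffect() { m_isSet = true; }
	// Cleanup of the platform-specific objects happens in the destructor
	~Effect() = default;

	bool m_isSet = false;	// stand-in for platform-specific state
};
```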

Each of these functions has a platform-specific implementation, which was mostly code copied over from the Graphics library.

Code to load the effect:


where s_Effect is the global effect, and CreateEffect takes the paths of the fragment and vertex shaders as arguments to load the respective shaders. This code is the same for both the Direct3D and OpenGL versions, since I believe the goal of this assignment is to make Graphics as platform-independent as possible (maybe completely; that would be cool).

Code to set the effect:


This code segment is also the same for both versions (I say segment because the code preceding it is still different, where the image buffer is cleared).

I hope that by the end of this semester we make the Graphics source completely platform-independent, though that seems an ideal goal at this point. Even so, there is very little platform-specific code left, so this should be achievable; fingers crossed 😉

The output for this assignment is the same as last time:


Link to the build:


That’s all for this week. Next week is fall break, so no assignments (I hope, or not, since I enjoy doing these assignments)…

Stay tuned for more after fall break…

Assignment 05 – Data-driven asset building

This week I will be modifying the AssetBuilder to build assets from a Lua file instead of providing each asset in the command line. This will make it easy to add assets later on, since there WILL be more assets(more meshes, maybe more shaders?) that will be added to the game.

To do this, the existing BuildAssets Lua file needs to be modified to directly read the assets Lua file, AssetsToBuild.lua. So the logic that was used to read the Mesh Lua file was “copied” over to the BuildAssets Lua file. I have to say, compared to the C++ logic of reading tables, this was easier AND shorter to code in Lua, since there is no need to maintain and track the position of the Lua state's stack. It is now simply two ‘for’ loops: one to read the outer dictionary of asset types, and a second to read the asset list within each type.
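In outline, those two loops look something like this (a sketch; the table and function names are illustrative, not my exact code):

```lua
-- Outer loop: each asset type ("meshes", "shaders", ...)
for assetType, assetInfo in pairs( assetsToBuild ) do
	-- Inner loop: each asset of that type
	for _, assetName in ipairs( assetInfo.assets ) do
		BuildAsset( assetName, assetInfo.buildtool )
	end
end
```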

My AssetsToBuild.lua format, How it works and why I designed it the way that I did


For my AssetsToBuild Lua file, my parent table is a dictionary of asset types. For now, the two keys are “meshes” and “shaders” since these are the two asset types we are using currently. Each asset type has four “attributes”:

  • The buildtool, which tells the asset builder which tool to use to build the assets belonging to this type.
  • assets, listing all the assets of this type.
  • source, specifying the relative path within the Assets folder.
  • destination, specifying the relative path within the data folder.
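A sketch of that shape (the tool and asset names here are examples, not my exact file):

```lua
return
{
	meshes =
	{
		buildtool = "MeshBuilder.exe",
		assets = { "Mesh.lua", "Triangle.lua" },
		source = "",
		destination = "",
	},
	shaders =
	{
		buildtool = "GenericBuilder.exe",
		assets = { "vertex.shader", "fragment.shader" },
		source = "",
		destination = "",
	},
}
```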

Notice that the source and destination attributes are currently blank. This is because we are simply copying all the assets from the Assets folder to the data folder. However, I have a feeling (I could be wrong!) that once the asset count grows large, we might decide to group the assets into different folders, and these attributes will help out in the copy process.

The reason behind the current design of this Lua file is that I wanted a very simple, human-readable format, so that a quick look at the file gives the user all the information he/she needs to understand or modify the data inside. With this design it is very easy to add assets (simply add to the “assets” list) or add a new type (copy either the “meshes” or “shaders” structure and replace the values with new ones).

Where I added AssetsToBuild.lua in Solution Explorer and why

The reason for this is the same as for BuildAssets.lua in my second assignment: I believe that all relevant files used by any project should be under the project’s directory in Visual Studio. VS does agree with me on this with regard to .cpp and .h header files, and I think the same policy should be followed for any other files used by the project, regardless of when they are used(compile-time or run-time).

The next part of this assignment was to add additional builders for the assets AND modify the existing AssetBuilder so that it uses the new BuildAssets Lua file, which in turn uses these builders to “build” the assets. Under the Tools folder, I now have four projects:


  • AssetBuilder which executes the BuildAssets lua file by providing the AssetsToBuild lua file as a command-line argument.
  • BuilderHelper which is essentially an interface providing a Build function that can be inherited by potential builders to “build” the assets according to each type.
  • GenericBuilder which simply copies each asset from source to destination.
  • MeshBuilder, which performs the same function as the GenericBuilder (for now; check next week for more on this!)

Each of these builders will be called by the BuildAssets lua file based on which asset it is trying to build at the time.
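The relationship between BuilderHelper and the concrete builders can be sketched like this (class and parameter names are illustrative, and the copy itself is stubbed out; the real GenericBuilder calls into the Windows library):

```cpp
#include <cassert>
#include <string>

// BuilderHelper's role: an interface providing a Build function
// that potential builders inherit and override.
class cbBuilder
{
public:
	virtual ~cbBuilder() = default;
	// Build a single asset from source to destination; returns success
	virtual bool Build( const std::string& i_source, const std::string& i_destination ) = 0;
};

// GenericBuilder: just copies the asset from source to destination
class cGenericBuilder : public cbBuilder
{
public:
	bool Build( const std::string& i_source, const std::string& i_destination ) override
	{
		// The real implementation copies i_source to i_destination
		// (e.g. via CopyFile); stubbed here so the sketch is self-contained
		return !i_source.empty() && !i_destination.empty();
	}
};
```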

The new custom build step for BuildAssets

Because of this change, the command line has changed from

"$(BinDir)AssetBuilder.exe" vertex.shader fragment.shader Mesh.lua

to simply

"$(BinDir)AssetBuilder.exe"

essentially removing all the command-line arguments, since all these details have been moved to the AssetsToBuild lua file.

Which projects AssetBuilder depends on and why

The AssetBuilder project is currently depending on the following projects:

  • Lua – to execute the BuildAssets lua file
  • Windows – for all the functions needed to build the asset(like CopyFile and CreateDirectoryIfExist)
  • BuilderHelper – since this project uses the utility functions provided by this library

Which projects BuildAssets depends on and why

The BuildAssets project just depends on the AssetBuilder project because of the custom build step mentioned above: the AssetBuilder project needs to be built for it to be executed by this project.

The final part of this assignment was simply to add a new triangle mesh and render it. Here is the final output for my assignment:

Here is a screenshot of when I debug the MeshBuilder:


Link to the build:


That’s it for now, stay tuned for next week’s Assignment…