Monthly Archives: September 2014

Game Engineering 2 Assignment 04


W – Up; S – Down; A – Left; D – Right


The project has been updated to use constant tables, which allow changing the color of a mesh.  Constant tables were also used to move a mesh around the screen.  The asset build pipeline was edited to allow multiple builders for assets.


The aim of the fourth assignment was to introduce constant tables, incorporate code from previous engineering classes and modify the build pipeline to allow more than one builder for assets.


I put together a system for using constants within rendering under the instruction of John-Paul.  Constants are values passed into a shader that modify the output of that shader.  Constants were used in two cases: per material and per instance.  The specific per-material case is a color shift applied to the mesh, set every time a different material is applied to the mesh during the fragment shader step of the rendering pipeline.  The specific per-instance case is a translation of the mesh in screen coordinates during the vertex shader step of the rendering pipeline.
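
As a rough illustration of the per-material/per-instance split, here is a minimal C++ sketch.  It is not the DirectX API the project actually uses (constants there are set through D3DX constant tables); ConstantTable, MakePerMaterial and MakePerInstance are hypothetical names.

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

// Illustrative stand-in for a shader constant table: a named set of float
// values handed to a shader before drawing.
struct ConstantTable
{
    std::map<std::string, std::vector<float>> constants;

    void Set(const std::string& name, std::vector<float> values)
    {
        constants[name] = std::move(values);
    }

    const std::vector<float>& Get(const std::string& name) const
    {
        return constants.at(name);
    }
};

// Per-material constants are applied once whenever the material is bound.
inline ConstantTable MakePerMaterial()
{
    ConstantTable table;
    table.Set("ColorModifier", { 1.0f, 1.0f, 0.0f }); // yellow tint
    return table;
}

// Per-instance constants are updated every frame as the mesh moves.
inline ConstantTable MakePerInstance(float x, float y)
{
    ConstantTable table;
    table.Set("PositionOffset", { x, y });
    return table;
}
```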

I added to the material file format from assignment 03.  I wanted to ensure there was room to grow with per material and per instance variables being separate.  I also wanted to maintain a human readable format.  The format I built was:

Shaders =
{
    Vertex = "data/vertexShader.hlsl",
    Fragment = "data/fragmentShader.hlsl",
    Constants =
    {
        PerMaterial =
        {
            ColorModifier =
            {
                Red = 1.0,
                Green = 1.0,
                Blue = 1.0,
            },
        },
    },
}

The values in ColorModifier are real numbers between 0 and 1.  The reason for tables within tables was to leave room to grow as more constants are added.  I am unsure how meshes will be added.  I assume meshes will have their own file format, so the values in this file are specific to materials.  By providing room for the file to grow I intend to save time later.  Readability also helps me and others who use my files.

Screenshots

I generated two screenshots of the project at this point.  The first screenshot uses RGB values of (1,1,0) and is yellow.  The second uses RGB values of (1,0.5,0.5) and is pink.  The first screenshot is in the default location.  The second has been displaced using the keyboard.  The displacement and coloring are done through constant tables provided to the shader.

Yellow rectangle in the default position with RGB values of (1,1,0).
Pink rectangle with RGB values of (1,0.5,0.5) and a displaced position all done through constant tables applied to the shaders associated with drawing the rectangle.


I added code for actors, memory management, mathematics, data structures, controllers, script handlers for file IO and timers.  Adding code from prior engineering classes was a lengthy process.  I spent time working on namespaces within Engine.  There is still work to be done.

I added memory leak detection.  I found a number of memory leaks that I hadn’t discovered last year in my old code.  I will continue looking for memory leaks.

My controllers ended up working great.  I need to find a better way to handle input.  I am iterating over all keys to fill an array of bits.  I would like to be able to pull individual keys.

I added a single actor to my World in Engine.  I need a way to handle multiple actors.  I am excited to bring in old code, but there is an associated maintenance cost in time.

I will be making a distinct Mesh class for handling mesh-related data.  Currently both are under the umbrella of Materials.  I will be revisiting my Materials class to remove anything not related to materials.  Vertex information is likely unrelated to materials and probably belongs in a mesh class.

My asset builders need attention.  I did little architecture work on the build tools.  BuilderHelper and GenericBuilder in particular need to at least become classes.


Updating my old code was time consuming.  I have spent more time on this homework than on any prior assignment except homework 1.  Game and Engine are currently not specific enough, and World is taking on more responsibility than it should.  I didn't have time to address that issue.

The eae6320 namespace still exists in my code.  I am going to clean it up since it is ugly, but I spent too much time on old code and getting things fixed for submission to worry about it now.

Next Steps

Implement the mesh class!  Fix input!  Obtain a better separation between game and engine.

Time Estimate

The workload for the project was heavy due to maintaining old code; otherwise the project wouldn't have been too bad.  I suspect the time I spent on this assignment will be valuable to the future of the project, however.

Activity Time Spent
Adding Constant Tables 3 hours
Implementing old code 5 hours
Adding New Build System 4 hours
Write up 1 hour
Total 13 hours

Game Engineering Assignment 03


The first part of the third assignment was to draw a rectangle on the screen.  The second part was to analyze the rectangle using PIX.  The third part was to make a file called a material, composed of the locations of the shaders that describe that material.


The aims of the third assignment were to introduce index buffers, continue working with PIX for debugging and start to abstract rendering by removing hard-coded paths to materials.


Index Buffers

Adding the index buffer was relatively simple.  Index buffers are a method for drawing triangles that share vertices.  The approach reduces memory usage in the vertex buffer when vertices are shared; in the best case a new triangle adds only a single new vertex because its other two vertices are reused.
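
The saving can be made concrete with the rectangle from this assignment.  This is a hedged sketch, assuming a 12-byte vertex and 16-bit indices; the Vertex struct and names here are illustrative, not the project's code.

```cpp
#include <array>
#include <cstdint>

// Stand-in for the project's vertex: two floats for position, 32-bit color.
struct Vertex { float x, y; std::uint32_t color; };

// Without indexing: two triangles, six vertices, two of them duplicated.
constexpr std::size_t kUnindexedVertices = 6;

// With indexing: four unique vertices plus six 16-bit indices, where
// {0,1,2} and {0,2,3} are the two triangles of the rectangle.
constexpr std::array<std::uint16_t, 6> kIndices = { 0, 1, 2, 0, 2, 3 };
constexpr std::size_t kIndexedVertices = 4;

constexpr std::size_t UnindexedBytes()
{
    return kUnindexedVertices * sizeof(Vertex); // 6 * 12 = 72 bytes
}

constexpr std::size_t IndexedBytes()
{
    // 4 * 12 + 6 * 2 = 60 bytes
    return kIndexedVertices * sizeof(Vertex)
         + kIndices.size() * sizeof(std::uint16_t);
}
```

Even for a single rectangle the indexed form is smaller, and the gap widens as more triangles share vertices.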

I developed a system to generate random rectangles.  The current shader pipeline needs a position and color.  The coordinate system for drawing in the current viewport is in normalized device coordinates.  That means that the center of the screen is (0,0).  The top left corner is (-1, 1).  Coordinates occupy the range ([-1 to 1], [-1 to 1]) with the first coordinate determining horizontal position and the second determining vertical position.

In order to provide a random rectangle I generate one value for the horizontal extent and one for the vertical.  The bottom left corner negates both values.  The top left corner negates only the horizontal value.  The top right corner negates neither.  The bottom right corner negates only the vertical value.  The idea came from the way quadrants are divided about the origin (0,0) in Cartesian coordinate systems.
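
The quadrant idea can be sketched as follows; Point, MakeQuadrantRectangle and Random01 are illustrative names, not code from the project.

```cpp
#include <array>
#include <cstdlib>

struct Point { float x, y; };

// Build the four corners from one half-width and one half-height, taking
// signs from the Cartesian quadrants about the origin.
std::array<Point, 4> MakeQuadrantRectangle(float halfWidth, float halfHeight)
{
    return {{
        { -halfWidth, -halfHeight }, // bottom left  (quadrant III)
        { -halfWidth,  halfHeight }, // top left     (quadrant II)
        {  halfWidth,  halfHeight }, // top right    (quadrant I)
        {  halfWidth, -halfHeight }, // bottom right (quadrant IV)
    }};
}

// Normalized device coordinates run from -1 to 1, so half-extents in
// [0, 1] keep the rectangle on screen.
float Random01()
{
    return static_cast<float>(std::rand()) / static_cast<float>(RAND_MAX);
}
```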


The third assignment is the second assignment to use PIX.  PIX is useful for debugging DirectX graphics information drawn to the screen.  The new call used to draw the index buffer is DrawIndexedPrimitive().  The coordinates and color of each vertex are shown in the “Details” window under “Mesh” and “PreVS”.

The rectangle randomly generated using quadrant based coordinates.
PIX data for the rectangle generated using quadrant-based coordinates. The call used was DrawIndexedPrimitive(). The index buffer is shown under Pre VS.  The table columns from left to right are VTX (the vertex number), IDX (the index number), position (the position provided to the vertex shader) and diffuse (the color provided to the vertex shader).  There are four distinct IDX values and six VTX rows.  Each IDX value has a specified position and color, and the VTX rows reuse IDX values to get the same position and color.

Material Asset

Building a material asset required developing a file format.  The information needed in the material asset is the paths to the vertex and fragment shaders.  I made the decision to name the file “material.lua” since the file is a Lua file and I am not worried about obscuring the data; in professional projects obscuring the data might be useful.  I decided to use a human-readable format to allow easier authoring of assets, which Lua provides with a format similar to JSON.

My material.lua file has the following contents:

shaders =
{
    vertex = "data/vertexShader.hlsl",
    fragment = "data/fragmentShader.hlsl",
}
In my rendering system I have an in-place method for breaking the table apart.  It iterates over the shaders in the table and determines whether each provided shader is useful to the system.  If a shader is specified but not used, a debug build will let the user know.

The reason for using a shader specification is that a material may be composed of more than just shaders, and I want to keep that option open.  Materials can be thought of as wood or glass, so translucency may be a property a material needs.  Maybe materials will have different physical properties for the physics system.

I added the material.lua file to my BuildAsset system.  The system will build the material.lua file alongside the shader files.  I will be encapsulating the material system as the need becomes more pressing.


The work was primarily in reading source examples from John-Paul.  I am becoming more proficient in reading the documentation.  I chose not to implement a separate material class yet.  I realize that implementing the material class will help in the future and plan to implement the material class.

Next Steps

Implement the material class!


John-Paul’s documentation and source code proved invaluable.  I haven’t contributed since the first assignment.  There seem to be fewer questions from the class overall.

Time Estimate

The workload for the project felt lighter than previous projects.  I do not mind having less to do.  I was surprised at how easily the project came together.

Activity Time Spent
Adding Index and Vertex Buffer with Randomized Vertices 2 hours
Analyzing with PIX 1 hour
Adding Lua to Graphics to Allow Changing Shaders 3 hours
Write up 1 hour
Total 7 hours

Game Engineering 2 Assignment 02


The first part of the second assignment added color to the rendering pipeline.  The outcome is a colorful triangle.  The learning objective was to learn about the rendering pipeline, particularly that the vertex shader passes information (i.e. color) to the pixel shader.

The second part of the second assignment used PIX to analyze pixel data.  The outcome was a screenshot of the colorful triangle and the color values of the triangle when DirectX draws the triangle.  The learning objective was to understand debugging frames from DirectX.

The third part of the second assignment added scripting to the tool that ensures assets needed by the game are built and in the correct directories.  The outcome was a Lua script that replicated behavior done in the prior assignment in C++.  The behavior copies files from a source to a target folder.  The learning objective was understanding the interface between C++ and Lua.


The aims of the second assignment were to understand that there is a rendering pipeline, to debug rendered frames using PIX and to incorporate Lua into C++ projects.


Rendering System

The rendering system from assignment 01 drew a white triangle.  For assignment 02 the triangle was colored by adding color information to the struct passed to the rendering system.  In order to add color the layout of the memory being passed to the rendering system needs to be specified.  For the purposes of assignment 02 the layout was two floats for position and a 32-bit integer as a DWORD for color.  A float is 4 bytes on a 32-bit x86 system which means the offset for color was 8 bytes.
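
That layout can be checked with a small sketch; the Vertex struct here is illustrative, not the project's actual declaration.

```cpp
#include <cstddef>
#include <cstdint>

// Two 32-bit floats for position followed by a 32-bit packed color.  Each
// float is 4 bytes, so the color lives at byte offset 8 -- the offset handed
// to DirectX when declaring the vertex format.
struct Vertex
{
    float x;             // bytes 0-3
    float y;             // bytes 4-7
    std::uint32_t color; // bytes 8-11, a D3DCOLOR-style packed ARGB value
};

static_assert(offsetof(Vertex, color) == 8, "color must sit 8 bytes in");
static_assert(sizeof(Vertex) == 12, "no padding expected in this layout");
```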

The entirety of that memory is passed into the rendering system starting with the vertex shader.  The vertex shader sets position and color outputs.  I am not entirely sure how the pipeline gets the output color information to the pixel shader since the vertex shader produces two outputs.  The pixel shader gets the color value from the vertex shader and sets the color for the pixel.  The color is linearly interpolated between the vertices, resulting in a gradient of color across the pixels drawn for that shape.  Assignment 02 is a triangle, which means there are three vertices.  I populated the vertices with random numbers in the range [0-255] using a call to D3DCOLOR_XRGB.  D3DCOLOR_XRGB takes integers in the range [0-255] for red, green and blue and packs them into a D3DCOLOR value.
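
As a sketch of what D3DCOLOR_XRGB computes, the packing can be reproduced in plain C++; PackXRGB is my own illustrative name, not a DirectX function.

```cpp
#include <cstdint>

// Pack 0-255 red, green and blue into a single 32-bit ARGB value with the
// alpha byte forced to 0xff, mirroring the D3DCOLOR_XRGB macro.
std::uint32_t PackXRGB(std::uint32_t r, std::uint32_t g, std::uint32_t b)
{
    return (0xffu << 24) | ((r & 0xffu) << 16) | ((g & 0xffu) << 8) | (b & 0xffu);
}
```

With the vertex colors from the PIX capture below, PackXRGB(163, 57, 21) yields the hex value A33915 under a full alpha byte.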

PIX is going to be a useful debugging tool.  PIX allows a DirectX program to run and frames to be sampled by pressing F12.  The frame can then be analyzed.  The DirectX calls are shown similar to a call stack.  Highlighting any of the calls shows information about that call.  For assignment 02 we looked at the mesh details for the DrawPrimitive call which is responsible for drawing the triangle.

Triangle from PIX – I generated a lot of random triangles.  I selected this triangle and analyzed it with PIX.

PIX data for the triangle I selected.  In the events region I have selected DrawPrimitive.  The details are related to that call since I have it highlighted.  I am looking at the mesh details.  There are three rectangular boxes with triangles in them that show the triangle before the vertex shader was applied, after the vertex shader was applied changing the position of the triangle and where the triangle was placed in the viewport (i.e. on the screen).  Underneath there are values which correspond with each vertex.  The position of each vertex and color are shown respectively.  The color numbers are alpha, red, green and blue.  Alpha for each vertex is full.  The colors for the other vertices in a hex and decimal format respectively are [0]->{(A3, 39, 15) or (163, 57, 21)}, [1]->{(59,F6,0E) or (89,246,14)} and [2]->{(6B,1A,AA) or (107,26,170)}.
Scripting for Building Assets

Incorporating Lua scripting into the tool to build assets involved adding a project for Lua.  Lua was compiled to a library and an include file for C++ was added to the tool for building assets.  My implementation of the tool for building assets required using private static functions.  Using C++ functions from Lua requires non-member or static functions.

Understanding the Lua stack was an interesting process.  Lua is dynamically typed and can return multiple values; C++ is statically typed and can only return a single value.  Lua uses a stack to interface with C++ in a way that supports getting the proper types and returning multiple values.  Error handling can occur in either Lua or C++.
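
The stack discipline can be modeled in a few lines.  This is a toy model, not the real Lua C API; Value, Stack and DivMod are invented names.

```cpp
#include <string>
#include <variant>
#include <vector>

// A shared stack of dynamically typed values, as Lua presents to C++.
using Value = std::variant<double, std::string>;
using Stack = std::vector<Value>;

// Modeled after a lua_CFunction: pop the arguments off the stack, push any
// number of results back, and return how many results were pushed -- the way
// Lua lets a C++ function "return" more than one value.
int DivMod(Stack& stack)
{
    double b = std::get<double>(stack.back()); stack.pop_back();
    double a = std::get<double>(stack.back()); stack.pop_back();
    stack.push_back(static_cast<double>(static_cast<long>(a) / static_cast<long>(b)));
    stack.push_back(static_cast<double>(static_cast<long>(a) % static_cast<long>(b)));
    return 2; // number of results left on the stack
}
```

In the real API the caller pushes arguments, Lua invokes the function with its state, and the function pops arguments and pushes results in the same way.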


The addition of a library and script directory is helping to keep code more organized.  There are now two libraries compiled in my project.  I have lua.lib and Engine.lib.  They both go to a special folder.  Scripts have their own folder which helps keep everything organized.  I made the root of my project my solution directory removing an intermediary directory.


I had a hard time understanding the Lua states being handed to functions in C++.  I figured out that the Lua states were passed like the this pointer in C++ from a Lua call.  The parameters are passed on the stack, so in C++ the user must pop the values off to use them.

Next Steps

I need to refine the Lua to C++ and vice versa interface.  I would also like to remove the BuildAsset project.  I would like to get rid of specifying assets.


I was isolated on the project.  I was unable to take time to answer e-mails or ask questions this week.  John-Paul provided a lot of the code in samples and instructions for assembly.

Time Estimate

I will be setting a timer in the future.  I have estimated the time required for finishing the project.  Working with Lua took the bulk of the time.

Activity Time Spent
Adding Color to Rendering Pipeline 2 hours
Analyzing with PIX 1 hour
Adding Lua to Build Assets 6 hours
Debugging 2 hours
Write up 1 hour
Total 12 hours

Showing Off Wall Walking

My team showed off Hostile Territory to Bob, Jose and Ryan. The feedback was great and has made our team discuss the game we are making. There is general excitement about wall walking.

I am working on the execution of wall walking. I am building a system with Ron Romero where the camera and controls can completely change. I am making wall walking part of that system.

I am trying not to influence design decisions too much. I am trying to keep myself to a role of being a decent engineer. I need to be a better team player when I am not the designer on a project. That means providing insight when it is needed and backing off otherwise.

Rotations and Wall Walking

I had decided today would be a good day to unravel the spaghetti surrounding the player controller in Hostile Territory’s source code. It turns out I wasn’t very interested in fixing that yet. Instead I spent my time figuring out the math to keep the player from feeling like their rotation has changed when they move onto another surface while wall walking.

In layman’s terms our shooter now has surface walking. That means the player could be on any surface in the game. If I were directing design, which I am not, I would allow the player to spin while in the air changing their gravity.

Overall the engineering for wall walking has been an amazing experience. I was not on the team from the beginning, but I watched the team get asked if people can run up the walls. Now they can.

More Walking Up The Walls

Today I improved the wall walking using a suggestion from Rob Guest. The calculation was being done with -Vector3.up, which is world down. Instead I am now using -transform.up, which points out of the bottom of the game object. Overall the approach makes sense. I remember it didn’t work before, but it works beautifully now. The end result is that the player can move up any surface so long as the game object’s bottom is facing it.

I also submitted my character controller to source control. The controller is high enough quality to submit. It will take a lot of work to get the controller into the game. Turns out a lot of the code from before is using the character controller we are removing. Modularity will be important moving forward.

I am going to figure out YouTube and upload videos there. No video today.

Walking Up The Walls

I have been working on a movement controller for Hostile Territory. Ron Romero is working on the camera and a different movement controller. I have been attempting to build my controller in such a way that the player can walk up walls. Hostile Territory is currently inside a cylinder. People ask if you can run up the walls of the cylinder.

I have a demo of walking up walls and jumping on the surface. The demo uses a raycast from the player to get the normal of the surface underneath them. The jump is oriented along the normal, and gravity for the player acts along the normal, pulling them toward the surface. I do not own the music in the video; I didn’t realize the sound was being recorded. The artist is Pendulum, but I am not sure which song it is.
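
The normal-oriented jump and gravity can be sketched with simple vector math; Vec3 and the function names are illustrative, not the Unity code from the demo.

```cpp
struct Vec3
{
    float x, y, z;
    Vec3 operator*(float s) const { return { x * s, y * s, z * s }; }
    Vec3 operator-() const { return { -x, -y, -z }; }
};

// The jump impulse is the surface normal scaled by jump strength, so the
// player always leaps away from whatever surface they stand on.
Vec3 JumpVelocity(const Vec3& surfaceNormal, float jumpSpeed)
{
    return surfaceNormal * jumpSpeed;
}

// Gravity pulls the player back toward the surface: opposite the normal.
Vec3 GravityFor(const Vec3& surfaceNormal, float gravity)
{
    return -surfaceNormal * gravity;
}
```

For a wall whose normal points along +x, the jump pushes the player in +x and gravity pulls them back in -x, exactly as if that wall were the floor.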



Game Engineering 2 Assignment 01


Asset tool chains are an important part of getting art and graphics related features into games.  With written directions and source code from John-Paul, the Game Engineering 2 instructor, I was able to build a system that moves specified files to the data directory of the compiled game.  In order to perform the task I had to change Visual Studio related settings including projects, property sheets, macros and libraries.  The source code provided by John-Paul was purposefully built with no inherent architecture.  I built an architecture separating tools, game and engine.  My architecture has also been built with portability in mind, encapsulating the Windows and DirectX systems within the world system in my engine.


The aims of my first project were to build a framework for tools, game and engine.

Architecture Details


The tools system is currently an asset build tool that ensures the assets such as high level shader language (HLSL) files are accessible to the game executable in a clean package.  I have left the source files provided by John-Paul largely unchanged with the exception of moving the files to an asset builder class.

There was name clashing between Windows functions and the names provided, so I added FromWindows to the names.  In the future the names need to be changed in a better way, but without knowing the next step I am content to leave them with similar names to prevent confusion.  I will be vigilant to the potential problems I have expressed as the project moves forward.


The game system is responsible for running the game loop.  In the future the game system may become a number of classes.  The game presently creates and initializes the world from the engine and then runs the game loop by calling update on the world.

The game can’t actually modify anything in the engine yet.  In the future I will need a way for the game to provide input to the engine that changes what is shown on screen.  A possibility for the current project was to have the game provide information about the white triangle, but that seemed like over-engineering and could end up hurting me as we continue to develop the systems.


The engine system is useful as a way to abstract details that gameplay programmers should not need to know.  As I was architecting the systems I recognized that creating a Window and graphics system is not part of the game.  I made a design decision that the Windows and graphics systems should be hidden from the gameplay programmers through a wrapper.  I called the wrapper the world system.

The world system knows about the current frame in the game.  The game includes the world and accesses features of the engine, such as the Windows and graphics systems, through initialization, update and shutdown functions.  In the future I will consider having multiple small classes allowing access to engine features, so the user of the engine can include smaller amounts of functionality and the world class doesn’t get bloated.  I am also concerned about keeping the project portable, which means I need a way to ensure the user only gets what is required for their operating system.


When I started the project I was unsure regarding the separation between getting the project running and architecture.  Now I believe they were blended for the assignment.  There was a lot of reading and I had to double check myself to ensure that I didn’t miss requirements.

I had the project running as explained in John-Paul’s directions before starting architecture work.  I drew my architecture out on paper before starting the architecture related work.  The basic idea is similar to the architecture used in Game Engineering 1 with the addition of tools.

I had an issue involving closing the window with the red X.  The object would be destroyed and then I would call DestroyWindow on the already destroyed object.  The callback function for Windows messages was static, but my handle for the window was not static and therefore could not be accessed in the callback function.  I was worried about having a leaked Windows handle.  After an e-mail from John-Paul and testing for leaks, I found that instead of a leaked Windows handle I had a stale pointer.  I changed my handle to static, and when the X is pressed the handle, which is now a stale pointer, is set to NULL before being destroyed.

I also had an issue with linking DirectX libraries in a static library and then using that library.  Libraries should be included at the application level and not in static libraries.  Including libraries in a static library and then using the library causes a duplication linker warning.

Next Steps

I am pleased with the architecture I have been building.  I would like to start merging code from Game Engineering 1 and AI into the system.  I would also like to start digging deep outside of class regarding engines, graphics and AI.


John-Paul was helpful on the project, helping solve the Windows handle issue and the linker error for including static libraries in static libraries.

Ron Romero commiserated on deleting the Windows handle more than once which helped me continue trying to find the source of the linker error.

Jamie King helped find a logic issue in my shutdown functions.  I was using “XOR” to evaluate two conditions.  If both were true then the result would be false.  I meant for the logic to be “AND” so that if either condition were false then there was a problem shutting down.
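
The bug can be reduced to a few lines; the function names here are illustrative, not the project's code.

```cpp
// With XOR, two successful shutdowns (both true) evaluate to false and look
// like a failure; AND reports success only when both succeeded, which was
// the intent.
bool ShutDownXorBug(bool graphicsOk, bool windowOk)
{
    return graphicsOk ^ windowOk; // wrong: true ^ true == false
}

bool ShutDownFixed(bool graphicsOk, bool windowOk)
{
    return graphicsOk && windowOk; // right: success only if both succeeded
}
```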

Time Estimate

I am not used to tracking my time, but will be doing so officially in the future.  I like the idea of associating time with commits.  Having said that my first breakdown of time is an estimate.

Activity Time Spent
Initial Working Build 6 hours
Pre-planning Architecture 1 hour
Building Architecture 6 hours
Squashing Bugs 2 hours
Write up 1 hour
Total 16 hours

Character Controller and Sticking to Surfaces

Today I worked on rotating an actor to be normal to a collided surface.  The project was an experiment.  I have a partially working version of the rotation needed to be normal to the surface the actor last collided with.  A more pressing issue, however, is getting jumping working for the character on a standard flat surface.

I spent the last portion of my day working on jumping.  I plan to get jumping finished for the character on Monday.  Ron and I will combine his camera work and my movement work into two or three player controllers to find a good fit to meet the needs of Hostile Territory.  Hostile Territory needs a controller that feels good platforming and shooting.

Character Controller and Networking

I have been assigned to handle the character controller.  I want to get the character controller feeling correct.  Our thesis game is currently using a third person view which makes aiming the camera for shooting difficult.  I am going to try to pull off a good third person shooter experience.  Third person shooters are generally a mistake, but if I can pull it off well then we are rewriting the rules that say third person shooters are a mistake.

I am also working on making sure the controller works over a network.  I will need to ensure that movement and projectiles work over the network.  I will also need to work on the animations over a network.  I need to set up a virtual machine or set Unity to run in the background.  I will look for that option now.

I am enjoying being part of the Hostile Territory thesis team.  I am avoiding the urge to push design changes since there is a lot of engineering work that needs to be done.  Engineers need side projects where they serve in a design function to stem the urge to push design decisions.  I am happy that I have Roller and my engine to keep me from worrying about design decisions.

If I were to redesign Hostile Territory shooting would be removed.  The game would still be third person.  The grid would start unclaimed.  Players would be able to run over the tiles to claim them.  The game would have a time limit.  The player that claimed the most territory at the end of the time limit is declared the winner.  Boxing in regions claims the interior territory.  The floating platforms would not be in the game.