Category Archives: Game Engineering 2

Game Engineering 2 Assignment 13

Controls

Left and right arrow keys move left and right. Forward arrow starts the car moving. Back stops the car.

The ‘1’ key toggles debug. Press it quickly to see Christmas lights! Create your own light show!

“Almost Christmas” is a game about final projects, tedium and long nights. The game has a Christmas theme and takes place in a tunnel where players can’t reach the end, but they can enjoy the light show by pressing the ‘1’ key rhythmically. In the end it is “Almost Christmas”.

Abstract

The scope of assignment 13 was to make a game with the engine. The game I built is called “Almost Christmas”. “Almost Christmas” is a game about not being able to reach Christmas break. The experience is meant to inspire tedium in the player. The player is presented with an artificial choice of left and right. Moving forward moves the player closer to Christmas, but before the player reaches Christmas they are warped back to the beginning. Here’s a picture of the game:

Aims

My aim for the assignment was to build a more automatic asset process. I also wanted to update my scene file format. Both goals were met. The game was supposed to be an AI racing game where the player and an AI opponent navigate a grid and whoever reaches the goal first wins. With the asset and scene modifications I wanted to make, the scope of the game was too large. I decided to make a game about my current emotional state. I long for Christmas. Capturing that in a game led to “Almost Christmas”, a game about the tedium of trying to reach Christmas break and getting set back.

Implementation

The bulk of my implementation time was spent on making the Reloader. The Reloader is a process that runs in the background and monitors the assets folder. When a change is made to the assets folder, the Reloader determines which asset changed. That asset is added to a list which is piped into the asset pipeline. The asset build process is run and builds the edited assets. Here are examples:

The Reloader is loaded. I have not made any modifications to files yet.
The Reloader is up and I have made edits to a scene, mesh, material, shader and texture file. There are even errors that are being printed due to copying and pasting a fragment shader and then renaming it to be a vertex shader. The not found error is due to the Reloader copying the copy file from Windows into AssetsToBuild. Definitely some kinks in the reloader, but I do not need to touch AssetsToBuild very frequently.
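The Reloader itself is not shown here, but the core monitoring loop can be sketched with a simple polling watcher. This is a minimal sketch, assuming a `FileWatcher` class name and callback of my own invention; a production reloader would likely use an OS change-notification API rather than polling, and the AssetsToBuild integration is omitted.

```cpp
#include <filesystem>
#include <functional>
#include <map>
#include <string>
#include <utility>

namespace fs = std::filesystem;

// Polls a directory tree and reports files whose timestamps changed or
// that appeared since the last poll.
class FileWatcher {
public:
    explicit FileWatcher(fs::path root) : m_root(std::move(root)) {
        for (const auto& entry : fs::recursive_directory_iterator(m_root))
            if (entry.is_regular_file())
                m_times[entry.path().string()] = entry.last_write_time();
    }

    // Invokes onChanged for every file that is new or modified since the
    // last poll, e.g. to append it to the list piped into the asset pipeline.
    void Poll(const std::function<void(const std::string&)>& onChanged) {
        for (const auto& entry : fs::recursive_directory_iterator(m_root)) {
            if (!entry.is_regular_file()) continue;
            const std::string key = entry.path().string();
            const auto time = entry.last_write_time();
            auto it = m_times.find(key);
            if (it == m_times.end() || it->second != time) {
                m_times[key] = time;
                onChanged(key);
            }
        }
    }

private:
    fs::path m_root;
    std::map<std::string, fs::file_time_type> m_times;
};
```

A caller would construct the watcher over the assets folder and call `Poll` periodically, handing each changed path off to the asset build process.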

The 2D UI element is the level. If another scene were made that scene would have the number 1 on it. I initially planned to have multiple racing maps, but had to make some concessions.

Overall, the asset builder has been the best tool from the class. I have been excited to expand its capabilities. My Reloader reduces the number of times I have to manually edit AssetsToBuild and execute AssetBuilder. Paired with a scene system that uses key-value pairs and controllers, I do not need to touch actual source code as frequently as I otherwise would, allowing my engine to be more generic.

Issues

I ran into time constraints which required me to reduce the scope of my game. Building better tools took most of the time allotted for the assignment. I am happy with my choice to build better tools. I am going to continue to extend my tools, but tool development takes a lot of time.

I would like to thank JP for explaining asset build processes. I am going to refine the process since it saves time. Sophisticated tools take time to build, but save time if used with enough frequency.

Time Estimate

The workload for the project was about 50 hours plus or minus 15. I spent a lot of time on tools. The game was maybe a 4-hour endeavor after building tools. I am tired. I am going to sleep and enjoy Christmas.

Activity Time Spent
Reloader Tool 20 hours
Editing Scene Format 20 hours
Scoping the Game and Freaking Out 4 hours
Building the Game 4 hours
Write up 2 hours
Total 50 hours

Game Engineering 2 Assignment 11

Controls

Atlas – [0-9] on the keyboard will show an image for [0-9].

Light: T – Positive Z; G – Negative Z; F – Negative X; H – Positive X; R – Add Red Light; V – Subtract Red Light; Y – Add Green Light; N – Subtract Green Light;

Camera: I – Positive Z; K – Negative Z; J – Negative X; L – Positive X; U – Positive Y; M – Negative Y; Insert – Rotate in Positive Z; Delete – Rotate in Negative Z; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive X; Down Arrow – Rotate in Negative X

Objects: WSAD – Move Object; up and down are Q and Z.

Abstract

The scope of assignment 11 was to add resolution independent sprites with a different rendering mode to the game. The shaders involved needed to allow atlasing. In addition to changing rendering modes, changing sampling state was introduced.

Aims

The aim of the eleventh assignment was to implement resolution-independent sprites, mix indexed and non-indexed drawing, and understand render states.

Implementation

My implementation started with improvements to the scene file. I am working toward an architecture where the user can leave data out of the scene file and have it initialized to default values. A case where the approach is most noticeable is determining the keys that move an object. In addition I have added handling of input and movement controllers in the scene file. The scene file currently contains all the actors in the game.
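The default-value idea can be sketched with a tiny lookup helper. This is a minimal sketch, assuming a hypothetical `ActorData` alias and `GetOrDefault` helper rather than the engine's actual API; the `NoMove` fallback matches the behavior described for controllers that are omitted from the scene file.

```cpp
#include <map>
#include <string>

// A scene actor's key/value pairs as parsed from the scene file.
// (Hypothetical representation for illustration.)
using ActorData = std::map<std::string, std::string>;

// Returns the value for `key`, or `fallback` when the scene file omits it.
// For example, a missing "controller" entry can default to "NoMove".
std::string GetOrDefault(const ActorData& actor,
                         const std::string& key,
                         const std::string& fallback) {
    auto it = actor.find(key);
    return it != actor.end() ? it->second : fallback;
}
```

With a helper like this, every optional field in an actor block can be left out of the scene file without special-case code at each call site.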

The scene file is being shown. Each actor is surrounded in curly braces. The type is specified and handled by the World at run-time when creating in-game objects for the scene. The name is the actor name in the game. Since I presently only allow one light source, the name Light is fitting. The location is a vector in the XZ plane from the origin. The rotation for a light is currently being used to determine color.
The new part is the controller. The movement part links to a controller in game that only does translation by a fixed velocity. The input controller determines the action taken for input.
The controls have changed so that the user should only define the directions that make sense. For direction of the light Y doesn’t make sense since the displacement in the XZ plane is the vector that is shifting the light. That means that Y has been excluded from the movement.

I have defined sprites that render in screen space. I have defined a way to add sprites to the game through the scene file. The current implementation is less generic than I would like, so I will be tuning the implementation.

Sprites can be defined in the scene file. Two sprite definitions are shown. The first sprite is the logo and uses the logo material. The logo material has a texture that is the EAE 6320 logo. The controller for the sprite is not defined. If the scene file has no data for the controller, the NoMove controller is given to the actor, preventing motion during the update loop.
To show that a controller does not need to be declared at all, the second actor does not have a controller block. The second sprite is a sprite sheet. Presently, to show individual numbers, the image is atlased by an internal controller. I will be adding a way to reference the controller from the scene file and provide specific UV coordinates on a per button basis in the future.
I specify Top, Bottom, Left and Right. As I was implementing sprites the idea of a top and left coordinate with a scaling factor came to mind. I am currently only using the top and left coordinates and will be updating the sprite data. Scaling the texture is not currently in the game.
Scaling is useful in my implementation because I am maintaining the size of the original texture currently. With a scaling factor UI programmers could re-size the texture in a data driven way.
An interesting design specification of sprites is that they are resolution independent and hold position on the screen. Having been a UI programmer I understand the importance of keeping the image not distorted and in the correct position. A potential pitfall to my method is that the area of the screen taken by the sprite depends on resolution. Adding flexibility for area may be a future iteration.
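The pixel-size-preserving placement described above can be sketched as follows. The `SpriteRect` helper is a hypothetical illustration, not engine code: it assumes a top-left anchor given as a fraction of the screen, keeps the sprite at its source texture's pixel size, and includes the optional data-driven scale factor discussed above.

```cpp
struct Rect { float left, top, right, bottom; };

// Computes a sprite's on-screen rectangle in pixels. The anchor is a
// resolution-independent fraction of the screen ([0,1] from the top-left),
// while the sprite keeps its texture's pixel size, so the rectangle's
// pixel dimensions are the same at every resolution.
Rect SpriteRect(float anchorU, float anchorV,
                float texWidth, float texHeight,
                float screenWidth, float screenHeight,
                float scale = 1.0f) {
    const float left = anchorU * screenWidth;
    const float top  = anchorV * screenHeight;
    return { left, top,
             left + texWidth * scale,
             top + texHeight * scale };
}
```

Note the trade-off mentioned above: because the pixel size is fixed, the fraction of the screen the sprite covers shrinks as resolution grows.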

I added new events for PIX. PIX has proven to be a fine way to debug DirectX related calls. I have been using it for all sorts of issues and feel comfortable with the features I have used.

PIX events have been created for both drawing meshes and drawing sprites. The render state is modified based on whether meshes or sprites are being drawn.
The sprites in the game are resolution independent. The top left window is an 800 by 600 window. The bottom window is 1600 by 600. The leftmost window is 800 by 1600. I didn’t line them up pixel-perfect, but the size is the same.
The sprite 1 in blue on the top left side of the screen is being shown when the 1 button on the keyboard is pressed. The 1 shown on the screen is from a sprite sheet. The sheet has the numbers [0-9] on it.
The same sprite is being used, but now the texture UVs have changed to show the 6. The 6 key is being held down.

Issues

I have traced the issues with my release build to the view to screen matrix using PIX. I am getting closer to resolving the issue. I have screenshots detailing the issue.

The PIX screenshot shows the information provided to the vertex shader for a cube. The values for position are well defined.
After the vertex shader runs the values for x and y coordinates are huge.
The view to screen matrix is in register c4 to c7. The numbers in release are not well defined. In debug the values are between -10 and 10. I have been looking into the creation of this matrix and will be creating a log file to figure out how to fix the issue.

I also had an issue with my sprite fragment shader. In order to add alpha information the shader needs to output four floats. I was only outputting three floats. The result was a messy image. Detection of the issue involved loading the .dds file into Visual Studio and removing channels until the image matched. When the image looked the same I knew which channels I was missing in the fragment shader.

I also defined the triangles of a sprite wrong such that the triangles were overlapping with a wedge of black on the left side. The fix was easy to figure out from PIX debugging.

Time Estimate

The workload for the project was about 14 hours.

Activity Time Spent
Updating Scene File 4 hours
Implementing Sprites 4 hours
Tracking Down Bugs 4 hours
Write up 2 hours
Total 14 hours

Game Engineering 2 Assignment 10

Controls

Light: T – Positive Z; G – Negative Z; F – Negative X; H – Positive X; R – Add Red Light; V – Subtract Red Light; Y – Add Green Light; N – Subtract Green Light;

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Objects: WSAD – Move Object

Abstract

The project has been to add lighting to the game by means of modifying the fragment shader. The type of lighting added is called diffuse. Diffuse lighting treats lighting as a unidirectional field applied to each vertex of the mesh.

The normal to the vertex is interpolated and sent to the fragment shader where color is determined using the cosine of the angle. The cosine is obtained through the dot product since the normal vector and the diffuse lighting vector are both of unit length and the dot product is the magnitude of two vectors multiplied by the cosine of the angle between them. Light color is applied directly to that cosine angle creating a shaded look. An ambient light is also applied to the pixel to prevent the scene from being entirely dark.
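The computation above can be written out in a few lines. This is a C++ stand-in for the fragment shader logic, with illustrative `Vec3` and `Shade` names rather than the actual shader code.

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Per-fragment diffuse shading: both n (the interpolated surface normal)
// and toLight are unit vectors, so their dot product is the cosine of the
// angle between them. The cosine is clamped to zero so surfaces facing
// away from the light receive only the ambient term.
Vec3 Shade(const Vec3& n, const Vec3& toLight,
           const Vec3& lightColor, const Vec3& ambient) {
    const float cosAngle = std::max(0.0f, Dot(n, toLight));
    return { ambient.x + lightColor.x * cosAngle,
             ambient.y + lightColor.y * cosAngle,
             ambient.z + lightColor.z * cosAngle };
}
```

A surface facing the light gets the full light color plus ambient; a surface perpendicular to or facing away from the light gets only ambient, which is what keeps the scene from going entirely dark.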

Aims

The aim of the tenth assignment was to implement diffuse lighting and become familiar with PIX pixel debugging.

Implementation

My implementation required updating scene files for lights. I had lingering bugs from the prior version that also needed squishing. I am nearly done with bugs, but there is a rendering bug in release where objects are not being shown. I will be investigating the issue using output to a log file. The application no longer crashes while in release. That issue was related to not building assets and not checking to see if there was a scene file.

I made some improvements to the scene file. The light type has been added. I also fixed an indexing issue: if non-renderable actors were placed before renderable actors, the indexing was off for scene files.

There have been improvements to the scene file. A new type is light which indicates a light source in the game. I am still uncertain how to add more than one camera or light source, but the format allows that expansion.

Ambient light is currently not an actor in the scene file. In fact, the ambient light has been hard-coded. I have a screenshot with line number information included that does not give away any code. The line number is 209 in Graphics.cpp.

The variable names for ambient light should remain the same until after grading. I am going to change ambient light to be an actor in the scene file. I am considering more separation in the scene file between camera, lights and actors.

The project renders better now. I changed the index order output from Maya from 1,2,3 to 1,3,2. The shapes look rounded with the addition of light.

The rabbit texture mapped wonderfully to the sphere in the scene. The target texture did not map as well to the cube. This is the first time I have used Maya for a project, so I am still learning how to map textures correctly. The texture mapping may be the reason the floor is not lit well, but if I change the ambient color to be more red, green or blue the white of the floor shows prominently.

I added much more information to the mesh. The data I currently have for each vertex is the XYZ coordinates, UV coordinates for texturing, normal XYZ coordinates for lighting, tangent and bitangent XYZ values for curvature-related functions, and the coloring information for RGBA values.

The new mesh format includes a lot more information. The vertex definition starts on line 5 and ends at line 39.

PIX is worthwhile for debugging. I was able to debug a pixel. The pixel information is shown below.

This is the rendering of the sphere with the rabbit texture. I right-clicked a gray colored pixel inside the rabbit’s ear. The pixel data is shown in the next screenshot.
This is the shader that showed up when debugging the pixel in the rabbit’s ear. The shader is being shown, allowing debugging of the values by variable or register. This is a debug version of the game. I do not think PIX works with a release build of the game presently since we have not enabled the correct debugging flags for DirectX. The symbols would not exist.

Issues

I am still having trouble with the release build. The image is black. I am going to dump information to a log file to track down the issue.

I have also been trying to figure out the math behind some of the work we have been doing for the project. I will need to read text on the matter to really understand the math. My understanding presently is that we are operating in a projected space. I am excited to figure out more about the math behind graphics.

Time Estimate

The workload for the project was about 12 hours.

Activity Time Spent
Tracking down bugs from assignment 09 3 hours
Implementing lighting 4 hours
Adding lighting to the rest of the systems i.e. scene 4 hours
Write up 1 hour
Total 12 hours

Game Engineering 2 Assignment 08

Controls

Box: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate In Negative Y; G – Rotate in Negative Z

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated with a texture builder. The new builder facilitates placing textures into the game. Textures are images that are wrapped around a mesh using UV coordinates. U and V were chosen because X, Y and Z were taken. U and V serve as coordinates to map a two dimensional image to a three dimensional mesh. The idea is akin to melting shrinky dinks onto an egg for Easter.
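The UV-to-image mapping can be made concrete with a small sketch. The `UVToTexel` helper below is a hypothetical illustration, not engine code; note also that which corner (0,0) refers to differs between graphics APIs.

```cpp
#include <algorithm>

struct Texel { int x, y; };

// Maps a UV coordinate pair in [0,1] x [0,1] to a texel position in a
// width-by-height image: (0,0) is one corner of the image and (1,1) the
// opposite corner, independent of the image's actual pixel dimensions.
Texel UVToTexel(float u, float v, int width, int height) {
    // Clamp so out-of-range coordinates cannot index outside the image.
    u = std::min(std::max(u, 0.0f), 1.0f);
    v = std::min(std::max(v, 0.0f), 1.0f);
    return { static_cast<int>(u * (width - 1) + 0.5f),
             static_cast<int>(v * (height - 1) + 0.5f) };
}
```

Because UVs are fractions rather than pixel positions, the same mesh can be wrapped with textures of any resolution without changing the mesh data.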

Aims

The aim of the eighth assignment was to build a way to get textures into the game. The result is a system that allows multiple meshes, materials and textures to be loaded into the game. The process is mostly automatic which simplifies adding art to the game.

Implementation

I started by adding the texture builder project to my tools. I modified my AssetsToBuild script to handle the new texture builder. Here is the new layout of the AssetsToBuild file.

--[[
	This file lists every asset to build
]]

return
{
	-- Shader Programs
	{
		builder = "ShaderBuilder.exe",
		assets =
		{
			-- Vertex shader programs
			{
				name = "vertexShader",
				extensions =
				{
					source = "hlsl",
					target = "shd"
				},
				arguments = { "vertex" },
			},
			-- Fragment shader programs
			{
				name = "fragmentShader",
				extensions =
				{
					source = "hlsl",
					target = "shd"
				},
				arguments = { "fragment" },
			},
		},
	},
	-- Meshes
	{
		builder = "MeshBuilder.exe",
		assets =
		{
			{
				name = "cube",
				extensions =
				{
					source = "mesh.lua",
					target = "mesh",
				},
				arguments = {}
			},
			{
				name = "floor",
				extensions =
				{
					source = "mesh.lua",
					target = "mesh",
				},
				arguments = {}
			},
		},
	},
	{
		builder = "TextureBuilder.exe",
		assets =
		{
			{
				name = "Rabbit",
				extensions =
				{
					source = "png",
					target = "dds",
				},
				arguments = {}
			},
			{
				name = "Bricks",
				extensions =
				{
					source = "png",
					target = "dds",
				},
				arguments = {}
			},
		},
	},
	-- Generic Assets
	-- (That just get copied as-is rather than built)
	{
		builder = "GenericBuilder.exe",
		assets =
		{
			{
				name = "bricks",
				extensions =
				{
					source = "material.lua",
					target = "material",
				},
				arguments = {}
			},
			{
				name = "rabbit",
				extensions =
				{
					source = "material.lua",
					target = "material",
				},
				arguments = {}
			},
		},
	},
}

I updated my material files. The new material file has specifications for textures. The texture and sampler are named explicitly. I would like to separate the texture from the material, but am unsure how to do so at this time. This is the bricks material.

return
{
	Shaders = {
		Vertex = "data/vertexShader.shd",
		Fragment = "data/fragmentShader.shd",
	},
	Texture = {
		Path = "./data/Bricks.dds",
		Sampler = "g_color_sampler",
	},
	Constants = {
		PerMaterial =
		{
			ColorModifier =
			{
				Red = 1.0,
				Green = 1.0,
				Blue = 1.0,
			}
		}
	},
}

I updated the mesh to include information about UV coordinates of the mapped texture. I also added information regarding the texture that is applied to the mesh. I would like to be able to divorce mesh and texture, but was unable to figure out a solution. Here is the floor mesh that has the bricks texture.

return
{
Material = {
Material = "./data/bricks.material",
Texture = "./data/Bricks.dds",
},
Vertices = {
{
Coordinates =
{
X = -10.0,
Y = -2.0,
Z = -10.0,
U = 0.0,
V = 1.0,
},
Color =
{
Red = 100.0,
Green = 100.0,
Blue = 100.0,
},
},
{
Coordinates =
{
X = 10.0,
Y = -2.0,
Z = -10.0,
U = 1.0,
V = 1.0,
},
Color =
{
Red = 100.0,
Green = 100.0,
Blue = 100.0,
},
},
{
Coordinates =
{
X = -10.0,
Y = -2.0,
Z = 10.0,
U = 0.0,
V = 0.0,
},
Color =
{
Red = 100.0,
Green = 100.0,
Blue = 100.0,
},
},
{
Coordinates =
{
X = 10.0,
Y = -2.0,
Z = 10.0,
U = 1.0,
V = 0.0,
},
Color =
{
Red = 100.0,
Green = 100.0,
Blue = 100.0,
},
},
},
Primitives = {
{
[0] = 0,
[1] = 2,
[2] = 3,
},
{
[0] = 0,
[1] = 3,
[2] = 1,
},
},
}

There were a lot of builder and file handler specific changes required for the new file formats. I had to ensure the new meshes, materials and textures were being loaded correctly. Textures are compiled by the TextureBuilder to the .dds format. Meshes are compiled to a binary format. Materials are copied as is to the data folder. The process is automated and helps get new assets into the game faster.

I have pictures of the new implementation. Here are two images. The first is with the cube facing the camera. The second is with the cube rotated. Both have the new textures. The brick texture turned out darker than expected. The rabbit texture worked well.

The rabbit and bricks have been applied to the cube and floor plane respectively.
The cube has been rotated to show two sides with a rabbit reverse on the other side. This result is due to the layout of UV coordinates. The bottom is also visible, but the UV coordinates do not map to the bottom of the surface since I am only using eight vertices. To map to the bottom there needs to be more vertices.
This is a PIX capture of the SetTexture instruction. The data at the pointer to the texture is being shown. The texture is the rabbit texture being mapped to the cube.

Organization

I created a new texture class in the Engine. I made a new texture builder that compiles image files to .dds for import by the Engine. The data formats for mesh and material changed significantly. Overall, however, the project was light on organization.

Issues

I had the wrong path to textures and ended up with a black screen for a while. I had to use PIX to figure out that the texture was being set to a NULL pointer. I was able to fix the problem due to using PIX.

Time Estimate

The workload took approximately 12 hours. I am excited with the work done on this assignment. Adding art through the pipeline will be much easier than the manual process we have been using. I look forward to the next assignment.

Activity Time Spent
Implementing the TextureBuilder 2 hours
Updating formats, file handlers and the engine 6 hours
Ensuring everything worked 3 hours
Write up 1 hour
Total 12 hours

Game Engineering 2 Assignment 07

Controls

Box: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate In Negative Y; G – Rotate in Negative Z

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated with a mesh builder. The new builder takes data from mesh files and turns that into a binary format that is quicker to load into the game. The game has a loader which will load any number of meshes into the game.

Aims

The aim of the seventh assignment was to create a builder for mesh data. The purpose is to avoid hard-coding of mesh related data. The project no longer creates random meshes.

Implementation

I started by modifying my AssetsToBuild.lua to specify extensions for each item instead of for an individual builder. The reason for the change was that the system needs to be able to handle files with differing extensions and not copy files that may be harmful but share the same file name minus the extension. The new format looks like:

--[[
	This file lists every asset to build
]]

return
{
	-- Shader Programs
	{
		builder = "ShaderBuilder.exe",
		assets =
		{
			-- Vertex shader programs
			{
				name = "vertexShader",
				extensions =
				{
					source = "hlsl",
					target = "shd"
				},
				arguments = { "vertex" },
			},
			-- Fragment shader programs
			{
				name = "fragmentShader",
				extensions =
				{
					source = "hlsl",
					target = "shd"
				},
				arguments = { "fragment" },
			},
		},
	},
	-- Meshes
	{
		builder = "MeshBuilder.exe",
		assets =
		{
			{
				name = "cube",
				extensions =
				{
					source = "mesh.lua",
					target = "mesh",
				},
				arguments = {}
			},
			{
				name = "floor",
				extensions =
				{
					source = "mesh.lua",
					target = "mesh",
				},
				arguments = {}
			},
		},
	},
	-- Generic Assets
	-- (That just get copied as-is rather than built)
	{
		builder = "GenericBuilder.exe",
		assets =
		{
			{
				name = "yellow",
				extensions =
				{
					source = "material.lua",
					target = "material",
				},
				arguments = {}
			},
			{
				name = "blue",
				extensions =
				{
					source = "material.lua",
					target = "material",
				},
				arguments = {}
			},
		},
	},
}

I built a file format for the mesh files. The mesh format specifies vertices and triangles. The system would require modification to use something other than triangles even though the file format supports using more vertices per primitive. The choice to make vertices and primitives without numeric labels was so that the system can get the count based on the size of the Lua table. That means that I do not need to tell the system how many vertices or primitives there are in the mesh. I assume the user will provide correct data for a mesh. I am in the process of allowing changes of material and texture, so that is currently a placeholder. The mesh file format looks like:

return
{
Material = "",
Vertices = {
{
Coordinates =
{
X = -0.9,
Y = -0.9,
Z = -0.9,
},
Color =
{
Red = 255.0,
Green = 255.0,
Blue = 0.0,
},
},
{
Coordinates =
{
X = 0.9,
Y = -0.9,
Z = -0.9,
},
Color =
{
Red = 0.0,
Green = 255.0,
Blue = 0.0,
},
},
{
Coordinates =
{
X = -0.9,
Y = 0.9,
Z = -0.9,
},
Color =
{
Red = 255.0,
Green = 0.0,
Blue = 0.0,
},
},
{
Coordinates =
{
X = 0.9,
Y = 0.9,
Z = -0.9,
},
Color =
{
Red = 0.0,
Green = 0.0,
Blue = 255.0,
},
},
{
Coordinates =
{
X = -0.9,
Y = -0.9,
Z = 0.9,
},
Color =
{
Red = 255.0,
Green = 0.0,
Blue = 255.0,
},
},
{
Coordinates =
{
X = 0.9,
Y = -0.9,
Z = 0.9,
},
Color =
{
Red = 0.0,
Green = 255.0,
Blue = 255.0,
},
},
{
Coordinates =
{
X = -0.9,
Y = 0.9,
Z = 0.9,
},
Color =
{
Red = 150.0,
Green = 192.0,
Blue = 75.0,
},
},
{
Coordinates =
{
X = 0.9,
Y = 0.9,
Z = 0.9,
},
Color =
{
Red = 75.0,
Green = 192.0,
Blue = 150.0,
},
},
},
Primitives = {
{
[0] = 0,
[1] = 2,
[2] = 1,
},
{
[0] = 3,
[1] = 1,
[2] = 2,
},
{
[0] = 0,
[1] = 1,
[2] = 4,
},
{
[0] = 1,
[1] = 5,
[2] = 4,
},
{
[0] = 2,
[1] = 6,
[2] = 3,
},
{
[0] = 6,
[1] = 7,
[2] = 3,
},
{
[0] = 4,
[1] = 2,
[2] = 0,
},
{
[0] = 4,
[1] = 6,
[2] = 2,
},
{
[0] = 3,
[1] = 5,
[2] = 1,
},
{
[0] = 3,
[1] = 7,
[2] = 5,
},
{
[0] = 6,
[1] = 4,
[2] = 5,
},
{
[0] = 7,
[1] = 6,
[2] = 5,
},
},
}

There are a number of ways to write and read the binary data. I decided to reduce disk I/O by writing and reading the entire binary file at once. Accomplishing this required placing the data into a single memory allocation. There were four variables to copy, so I used four memcpy calls to write into the allocation while keeping track of the offset. To read the file I read the entire binary file in at once, read the vertex and primitive counts from the start of the memory block, and then extract the vertex and primitive data with a single memcpy for each. With that data loaded my mesh class can pull the data from the reader.
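The write-then-read round trip can be sketched as follows. This is a hedged illustration, not the engine's actual code: the struct layout, the function names, and the choice of uint32_t to stand in for a DWORD-sized color are all assumptions.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Illustrative vertex layout (not the engine's actual format).
struct Vertex { float x, y, z; uint32_t color; };

// Pack both counts and both arrays into one contiguous buffer so the
// whole mesh can hit the disk in a single write: four memcpy calls,
// tracking the offset as we go.
std::vector<char> PackMesh(const std::vector<Vertex>& verts,
                           const std::vector<uint32_t>& indices)
{
    const uint32_t vertexCount = static_cast<uint32_t>(verts.size());
    const uint32_t indexCount = static_cast<uint32_t>(indices.size());
    std::vector<char> buffer(sizeof vertexCount + sizeof indexCount
        + vertexCount * sizeof(Vertex) + indexCount * sizeof(uint32_t));
    size_t offset = 0;
    std::memcpy(&buffer[offset], &vertexCount, sizeof vertexCount);
    offset += sizeof vertexCount;
    std::memcpy(&buffer[offset], &indexCount, sizeof indexCount);
    offset += sizeof indexCount;
    std::memcpy(&buffer[offset], verts.data(), vertexCount * sizeof(Vertex));
    offset += vertexCount * sizeof(Vertex);
    std::memcpy(&buffer[offset], indices.data(), indexCount * sizeof(uint32_t));
    return buffer;
}

// Reading reverses the process: the counts at the front of the block
// say how many bytes belong to each array.
void UnpackMesh(const std::vector<char>& buffer,
                std::vector<Vertex>& verts, std::vector<uint32_t>& indices)
{
    size_t offset = 0;
    uint32_t vertexCount = 0, indexCount = 0;
    std::memcpy(&vertexCount, &buffer[offset], sizeof vertexCount);
    offset += sizeof vertexCount;
    std::memcpy(&indexCount, &buffer[offset], sizeof indexCount);
    offset += sizeof indexCount;
    verts.resize(vertexCount);
    std::memcpy(verts.data(), &buffer[offset], vertexCount * sizeof(Vertex));
    offset += vertexCount * sizeof(Vertex);
    indices.resize(indexCount);
    std::memcpy(indices.data(), &buffer[offset], indexCount * sizeof(uint32_t));
}
```

The point of the single allocation is that the subsequent disk write (and the read on the other side) is one I/O call instead of four.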

Organization

I made a new project containing the shared file formats and file-loading types. The tools (i.e. the new mesh builder) use the mesh loader to read in and write out the binary file, and the engine uses the same loader to load the binary mesh file at runtime. Since the data in the mesh loader is identical on both sides, it made sense to use a common class.

Issues

I couldn’t remember why my mesh class stored counts for both primitives and indices; I was only using both in one place. During drawing I use the number of primitives explicitly. My best guess, and the reason I am leaving both counts in place, is that storing both makes it easier to later add primitives with shapes other than triangles. If memory usage becomes an issue, I will need to consider whether storing both numbers takes up too much room, since there will likely be many meshes.

Time Estimate

The workload took approximately 12 hours. I have also worked on parts of assignment 08 while working on assignment 07. I am excited to get both assignments done soon. The builders are going to allow some interesting engineering in the future.

Activity Time Spent
Implementing of Mesh Builder and Mesh Format 4 hours
Fixing AssetsToBuild.lua and Converting Mesh to Binary Data 4 hours
Incorporating the New Mesh Data into the Engine 3 hours
Write up 1 hour
Total 12 hours

Game Engineering 2 Assignment 06

Controls

Box: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate In Negative Y; G – Rotate in Negative Z

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated with new builders inside the tools.  There are no visual changes.  The file structure now includes precompiled shaders.  The reason for compiling at build time instead of run-time is speed and reliability.  The program gains speed since clock cycles can be devoted to tasks other than compiling the shader.  The program gains reliability since build errors surface before run-time, helping to ensure the end-user does not end up with broken software.

Aims

The aim of the sixth assignment was to develop more build related tools to make the build process more efficient and robust.  Efficiency has been gained since shader compilation now happens at build time, freeing run-time cycles for other tasks.  Robustness was gained by building a structure for builders that allows more than one builder to run a process with input from a human readable file.  Human readable files provide a means for non-programmers to change the project without necessarily needing the aid of an engineer.

Implementation

I built a Lua file structure for loading shaders.  Using an example from John-Paul, I made modifications to the generic builder to specify the file extension in the arguments of the file.  The generic builder does a direct copy of data, but needs to know the extension of the file.  There is likely a programmatic solution to the problem, but for now users need to specify the extension of the file.

The builders to this point are the Shader Builder, Mesh Builder and Generic Builder.  The Shader Builder compiles shader files that have been placed in the assets folder and added to the list.  The Mesh Builder is a work in progress and will compile mesh data to a binary file helping to streamline asset creation.  Lastly, the Generic Builder copies files directly with no modification.

Organization

My Lua file specification for sending assets to the asset builder uses a human readable format similar to the one employed by John-Paul.  I wanted to be able to change extensions, so I kept the extensions field.  I extended the generic builder to require the file extension as an argument.  I will be looking for a programmatic way to handle extensions.

Here is the structure of the asset build list Lua file.

--[[
This file lists every asset to build
]]

return
{
-- Shader Programs
{
builder = "ShaderBuilder.exe",
extensions =
{
source = "hlsl",
target = "shd",
},
assets =
{
-- Vertex shader programs
{
name = "vertexShader",
arguments = { "vertex" },
},
-- Fragment shader programs
{
name = "fragmentShader",
arguments = { "fragment" },
},
},
},
-- Meshes
{
builder = "MeshBuilder.exe",
extensions =
{
source = "mesh.lua",
target = "mesh",
},
assets =
{
{
name = "cube",
},
{
name = "floor",
},
},
},
-- Generic Assets
-- (That just get copied as-is rather than built)
{
builder = "GenericBuilder.exe",
extensions =
{
source = "txt",
target = "txt",
},
assets =
{
{
name = "material",
arguments = { source = "lua" }
},
},
},
}

There are dangers to the file format.  The file runs the risk of getting long.  Ultimately, developing a tool to manipulate the file without editing raw text will be necessary.  The design is programmer friendly, but will get unwieldy for non-programmers.

Issues

I had an issue with loading the shader binary data.  I thought it needed to be loaded into ID3DXBuffer somehow.  Saumya pointed out that the data was just binary data and could be used directly through a DWORD*.  John-Paul added that any pointer to raw data would work including DWORD* or void*.  I have worked with memory directly, but for some reason I thought it needed to be in a special struct like ID3DXBuffer.  I am grateful for the responsiveness of the engineers in the course.
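To illustrate the point Saumya made, here is a hedged sketch of loading a compiled shader as plain bytes. The file name and function name are hypothetical, and uint32_t stands in for DWORD.

```cpp
#include <cstdint>
#include <fstream>
#include <vector>

// Read a compiled shader file into memory as raw bytes. No special
// container is needed: a pointer to the data (DWORD* / void*) is what
// the Direct3D create-shader calls ultimately consume.
std::vector<char> LoadShaderBinary(const char* path)
{
    std::ifstream file(path, std::ios::binary | std::ios::ate);
    std::vector<char> buffer;
    if (!file) return buffer;  // caller treats empty as failure
    buffer.resize(static_cast<size_t>(file.tellg()));
    file.seekg(0);
    file.read(buffer.data(), static_cast<std::streamsize>(buffer.size()));
    return buffer;
}
```

The buffer can then be handed off via `reinterpret_cast<const uint32_t*>(buffer.data())`; no ID3DXBuffer required.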

Time Estimate

Given that fall break fell during the assignment the workload wasn’t bad.  The problems were minor and light.  I estimate I stayed within the 12 hours of coursework expected for the class.  I am going to try to get the mesh loader done early so I can spend more time working on IGF related projects.

Activity Time Spent
Implementing New Builder Information 4 hours
Architecting Lua Reader for New Asset List 2 hours
Adding Shader Loading to Game Engine 2 hours
Write up 1 hour
Total 9 hours

Game Engineering 2 Assignment 05

Controls

Box: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate In Negative Y; G – Rotate in Negative Z

 

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated such that there are two meshes.  The first mesh is a cube of varied color.  The second mesh is a solid color plane below the cube.  The idea is the plane below the cube will show distance in Z well.  The camera is a new addition as well.  The project is now rendering in 3D including 3D movement.

Aims

The aim of the fifth assignment was to provide instruction in moving from 2D to 3D.  The move tested my architecture by being a forceful hand toward separating out data associated with meshes, materials and cameras.  The assignment also allowed me to understand debugging shaders using PIX.  I also learned about moving objects between spaces.  The space transitions were model => world => view => screen.

Implementation

I built a system that separates mesh, material and camera.  I encapsulated my shader loading code and material file loading code.  My guess is that future assignments will be moving away from loading shaders directly from file path, so the sooner I can get the shader encapsulated the better.   I am allowing definition of meshes from outside the mesh class.  I still need to be able to make graphics and actor changes from the game.  I will be working on Engine and Game separation if I find spare time.

Screen Shots

Debugging information using PIX for a Vertex Shader. The model has been moved to world space and is now being moved into the view. The fifth assignment required moving from Model => World => View => Screen spaces.
The new colorful cube is at the rear of the gray plane.

Organization

I finally made separate classes for the material loader, material, mesh, camera, fragment and vertex shader.  I made adding meshes something that is done a layer above the mesh class.  I am going to continue to extend the classes so that Game can define actors with meshes or cameras.

Issues

I had a registry issue that didn’t allow me to use PIX.  I fixed the issue by modifying my registry.  I made a list of steps to fix the issue.  The steps are slightly informal and may be ambiguous, so please let me know and I will try to fix them.

1)  If you are getting an error saying something like “need to modify registry” and you say okay, but it says it can’t change your registry then your installation of DirectX and Direct3D are currently using TrustedInstaller permissions.
2)  Open regedit by pressing the start button and in the search bar typing regedit.
3)  Navigate to registry path “HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Direct3D”.
4)  Right click on the thing that looks like a folder.
5)  In the resulting menu click permissions.  Permissions is the second to last option.
6)  Click on Administrators and try to give Administrator “Full Control” by checking the allow box and then Apply.  If you get an error continue to the next step.
7)  You need a bigger hammer.  Press the “Advanced” button.
8)  A menu should pop up.  Click on the owner tab.
9)  Owner is probably TrustedInstallers.  Make the owner Administrators.  Highlight Administrators and click apply.  The name should change.
10)  Hit the okay button to leave the Advanced modal Window.
11)  Now give Administrators Full Control by selecting the check box.
12)  Go back to PIX and try to allow it to change the registry again.
13) If that fails, go back to start and enter into the search bar “directx control panel”.  Find the directx control panel tool.
14)  In DirectX Control Panel select “Direct3D 9” tab.
15)  Click “Enable Shader Debugging” and hit apply.
16)  Restart DirectX Control Panel and make sure the checkbox is checked for “Enable Shader Debugging”.
17)  That’s all I have for fixing the issue.  Enjoy!

Next Steps

OBTAIN A BETTER SEPARATION BETWEEN GAME AND ENGINE (Like using caps makes it more important or something).  Continue to abstract the graphics engine.  Make primitive types for cubes, quads and triangles for faster assembly.

Time Estimate

The workload was tolerable.  Much better than last week.  The class was warned that this assignment would be telling if we like graphics or not.  I like graphics.  Also, 12 hours is about where I want to be on assignments.

Activity Time Spent
Encapsulating Graphics Classes 5 hours
Implementing 3D Graphics 3 hours
Debugging 3 hours
Write up 1 hour
Total 12 hours

Game Engineering 2 Assignment 03

Abstract

The first part of the third assignment was to draw a rectangle on the screen.  The second part was to analyze the rectangle using PIX.  The third part was to make a material file composed of the locations of the shaders that describe that material.

Aims

The aims of the third assignment were to introduce index buffers, continue working with PIX for debugging and start to abstract rendering by removing hard-coded paths to materials.

Implementation

Index Buffers

Adding the index buffer was relatively simple.  Index buffers are a method for drawing triangles that share vertices.  The approach reduces memory usage in the vertex buffer when vertices are shared, with the best case being a brand new triangle that costs only a single new vertex because the other two are shared.
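The classic example is a quad built from two triangles: six vertex slots, but only four unique vertices.  A hedged sketch follows; the vertex layout and color packing are illustrative, not this project's actual format.

```cpp
#include <cstdint>

// Illustrative vertex: 2D position plus a packed 32-bit color.
struct Vertex { float x, y; uint32_t color; };

// Four unique vertices instead of the six an unindexed quad would need.
const Vertex quadVertices[4] = {
    { -0.5f, -0.5f, 0xFFFFFFFFu },  // 0: bottom-left
    {  0.5f, -0.5f, 0xFFFFFFFFu },  // 1: bottom-right
    { -0.5f,  0.5f, 0xFFFFFFFFu },  // 2: top-left
    {  0.5f,  0.5f, 0xFFFFFFFFu },  // 3: top-right
};

// Six 16-bit indices; vertices 1 and 2 are shared by both triangles.
const uint16_t quadIndices[6] = { 0, 1, 2,   2, 1, 3 };
```

At 12 bytes per vertex, the indexed quad costs 4*12 + 6*2 = 60 bytes versus 72 bytes unindexed, and the savings grow as meshes share more vertices.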

I developed a system to generate random rectangles.  The current shader pipeline needs a position and color.  The coordinate system for drawing in the current viewport is in normalized device coordinates.  That means that the center of the screen is (0,0).  The top left corner is (-1, 1).  Coordinates occupy the range ([-1 to 1], [-1 to 1]) with the first coordinate determining horizontal position and the second determining vertical position.

In order to provide a random rectangle I generate a coordinate for vertical and horizontal positions.  The bottom left coordinate is the negative vertical and horizontal values.  The top left coordinate has a negative horizontal value.  The top right coordinate has no negative values.  The bottom right coordinate has a negative vertical value.  The idea came from the way quadrants are divided in Cartesian coordinate systems about the origin (0,0).
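That sign-mirroring idea can be sketched like this (an illustrative version with hypothetical names, not the project's actual code):

```cpp
#include <random>

struct Corner { float x, y; };

// Pick one horizontal and one vertical extent in (0, 1), then mirror
// the signs per quadrant about the origin, the same way Cartesian
// quadrants divide the plane around (0,0).
void RandomRectangle(Corner out[4])
{
    static std::mt19937 rng{ std::random_device{}() };
    std::uniform_real_distribution<float> extent(0.05f, 1.0f);
    const float h = extent(rng);  // horizontal half-width
    const float v = extent(rng);  // vertical half-height
    out[0] = { -h, -v };  // bottom-left  (quadrant III)
    out[1] = { -h,  v };  // top-left     (quadrant II)
    out[2] = {  h,  v };  // top-right    (quadrant I)
    out[3] = {  h, -v };  // bottom-right (quadrant IV)
}
```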

PIX

The third assignment is the second assignment to use PIX.  PIX is useful for debugging DirectX graphics information drawn to the screen.  The new call used to draw the index buffer is DrawIndexedPrimitive().  The coordinates and color of each vertex are shown in the “Details” window under “Mesh” and “PreVS”.

The rectangle randomly generated using quadrant based coordinates.
Pix Data for the rectangle generated using quadrant based coordinates. The call used was DrawIndexedPrimitive(). The index buffer is shown under Pre VS.  In the table, from left to right: VTX is the vertex number, IDX is the index number, position is the position provided to the vertex shader, and diffuse is the color provided to the vertex shader.  There are six table entries but only four unique vertices; each unique vertex has a specified position and color, and the IDX values reuse vertices so that shared corners get the same position and color without duplication.

Material Asset

Building a material asset required developing a file format.  The information needed in the material asset is the path to the vertex and fragment shader.  I made the decision to name the file “material.lua” since the file is a Lua file and I am not worried about obscuring the data.  In professional projects obscuring the data might be useful.  I decided to use a human readable format to allow easier authoring of assets which is something Lua provides with a format similar to JSON.

My material.lua file has the following contents:

return
{
shaders = {
vertex = "data/vertexShader.hlsl",
fragment = "data/fragmentShader.hlsl",
}
}

In my rendering system I put a method in place for breaking the table apart.  The method iterates over the shaders in the table and determines whether each provided shader is useful to the system.  If a shader is specified but not used, a debug build will let the user know.

The reason for using a shader specification is that a material may be composed of more than just shaders.  I want to keep that option open.  Materials can be thought of as wood or glass, so translucency may be a property that a material needs.  Maybe materials will have different physical properties for the physics system.

I added the material.lua file to my BuildAsset system.  The system will build the material.lua file alongside the shader files.  I will be encapsulating the material system as the need becomes more pressing.

Issues

The work was primarily in reading source examples from John-Paul.  I am becoming more proficient in reading the documentation.  I chose not to implement a separate material class yet.  I realize that implementing the material class will help in the future and plan to do so.

Next Steps

Implement the material class!

Acknowledgements

John-Paul’s documentation and source code proved invaluable.  I haven’t contributed since the first assignment.  There seem to be fewer questions from the class overall.

Time Estimate

The work load for the project felt like less than previous projects.  I do not mind having less to do.  I was surprised at how easy the project came together.

Activity Time Spent
Adding Index and Vertex Buffer with Randomized Vertices 2 hours
Analyzing with PIX 1 hour
Adding Lua to Graphics to Allow Changing Shaders 3 hours
Write up 1 hour
Total 7 hours

Game Engineering 2 Assignment 02

Abstract

The first part of the second assignment added color to the rendering pipeline.  The outcome is a colorful triangle.  The learning objective was to learn about the rendering pipeline, particularly that the vertex shader passes information (i.e. color) to the pixel shader.

The second part of the second assignment used PIX to analyze pixel data.  The outcome was a screenshot of the colorful triangle and the color values of the triangle when DirectX draws the triangle.  The learning objective was to understand debugging frames from DirectX.

The third part of the second assignment added scripting to the tool that ensures assets needed by the game are built and in the correct directories.  The outcome was a Lua script that replicated behavior done in the prior assignment in C++.  The behavior copies files from a source to a target folder.  The learning objective was understanding the interface between C++ and Lua.

Aims

The aims of the second assignment were to understand that there is a rendering pipeline, debugging rendered frames using PIX and incorporating Lua into C++ projects.

Implementation

Rendering System

The rendering system from assignment 01 drew a white triangle.  For assignment 02 the triangle was colored by adding color information to the struct passed to the rendering system.  In order to add color the layout of the memory being passed to the rendering system needs to be specified.  For the purposes of assignment 02 the layout was two floats for position and a 32-bit integer as a DWORD for color.  A float is 4 bytes on a 32-bit x86 system which means the offset for color was 8 bytes.
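That layout can be written down as a struct.  This is a sketch with illustrative names, using uint32_t to stand in for the DWORD; the two preceding 4-byte floats are what put the color at byte offset 8.

```cpp
#include <cstddef>
#include <cstdint>

// Two 4-byte floats for position followed by a packed 32-bit color
// (D3DCOLOR-style 0xAARRGGBB), so color starts at byte offset 8.
struct ScreenVertex
{
    float    x;      // bytes 0-3
    float    y;      // bytes 4-7
    uint32_t color;  // bytes 8-11
};

static_assert(offsetof(ScreenVertex, color) == 8,
              "vertex declaration expects color at offset 8");
```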

The entirety of that memory is passed into the rendering system starting with the vertex shader.  The vertex shader sets position and color outputs.  I am not entirely sure how the pipeline gets the output color information to the pixel shader since the vertex shader produces two outputs.  The pixel shader gets the color value from the vertex shader and sets the color for the pixel.  The color is linearly interpolated between pixels resulting in a gradient of color between the pixels that were drawn for that shape.  Assignment 02 is a triangle which means there are three vertices.  I populated the vertices with random numbers in the range of [0-255] using a call to D3DCOLOR_XRGB.  D3DCOLOR_XRGB takes integers in the range of [0-255] for red, green and blue and gets a representation as a D3DCOLOR value.

PIX is going to be a useful debugging tool.  PIX allows a DirectX program to run and frames to be sampled by pressing F12.  The frame can then be analyzed.  The DirectX calls are shown similar to a call stack.  Highlighting any of the calls shows information about that call.  For assignment 02 we looked at the mesh details for the DrawPrimitive call which is responsible for drawing the triangle.

Triangle from PIX – I generated a lot of random triangles.  I selected this triangle and analyzed it with PIX.

PIX data for the triangle I selected. In the events region I have selected DrawPrimitive. The details are related to that call since I have it highlighted. I am looking at the mesh details. There are three rectangular boxes with triangles in them that show the triangle before the vertex shader was applied, after the vertex shader was applied changing the position of the triangle and where the triangle was placed in the viewport (i.e. on the screen). Underneath there are values which correspond with each vertex. The position of each vertex and color are shown respectively. The color numbers are alpha, red, green and blue. Alpha for each vertex is full. The colors for the other vertices in a hex and decimal format respectively are [0]->{(A3, 39, 15) or (163, 57, 21)}, [1]->{(59,F6,0E) or (89,246,14)} and [2]->{(6B,1A,AA) or (107,26,170)}.
Scripting for Building Assets

Incorporating Lua scripting into the tool to build assets involved adding a project for Lua.  Lua was compiled to a library and an include file for C++ was added to the tool for building assets.  My implementation of the tool for building assets required using private static functions.  Using C++ functions from Lua requires non-member or static functions.

Understanding the Lua stack was an interesting process.  Lua is dynamically typed and its functions can return multiple values.  C++ is statically typed and a function can only return a single value.  Lua uses a stack to interface with C++ in a way that supports getting proper types and returning multiple values.  Error handling can occur in either Lua or C++.

Organization

The addition of a library and script directory is helping to keep code more organized.  There are now two libraries compiled in my project.  I have lua.lib and Engine.lib.  They both go to a special folder.  Scripts have their own folder which helps keep everything organized.  I made the root of my project my solution directory removing an intermediary directory.

Issues

I had a hard time understanding the Lua state being handed to functions in C++.  I figured out that the Lua state is passed like the this pointer in C++ from a Lua call.  The parameters are passed on the stack, so in C++ the user must pop the values off to use them.
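As a toy model of that convention (emphatically not the real Lua C API, just the shape of it): arguments arrive on a stack, the function pops them off, pushes any number of results, and returns how many it pushed.

```cpp
#include <vector>

// A stand-in for the Lua value stack; real Lua values are typed, but
// numbers are enough to show the mechanism.
using Stack = std::vector<double>;

// Mirrors the shape of a lua_CFunction: pop the arguments off the
// stack, push the results, and report how many results were pushed.
// Returning a count is how one function "returns" multiple values.
int DivMod(Stack& s)
{
    const long b = static_cast<long>(s.back()); s.pop_back();
    const long a = static_cast<long>(s.back()); s.pop_back();
    s.push_back(static_cast<double>(a / b));  // first result: quotient
    s.push_back(static_cast<double>(a % b));  // second result: remainder
    return 2;
}
```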

Next Steps

I need to refine the Lua to C++ and vice versa interface.  I would also like to remove the BuildAsset project.  I would like to get rid of specifying assets.

Acknowledgements

I was isolated on the project.  I was unable to take time to answer e-mails or ask questions this week.  John-Paul provided a lot of the code in samples and instructions for assembly.

Time Estimate

I will be setting a timer in the future.  I have estimated the time required for finishing the project.  Working with Lua took the bulk of the time.

Activity Time Spent
Adding Color to Rendering Pipeline 2 hours
Analyzing with PIX 1 hour
Adding Lua to Build Assets 6 hours
Debugging 2 hours
Write up 1 hour
Total 12 hours

Game Engineering 2 Assignment 01

Abstract

Asset tool chains are an important part of getting art and graphics related features into games.  With written directions and source code from John-Paul, the Game Engineering 2 instructor, I was able to build a system that moves specified files to the data directory of the compiled game.  In order to perform the task I had to change Visual Studio related settings including projects, property sheets, macros and libraries.  The source code provided by John-Paul was purposefully built with no inherent architecture.  I built an architecture separating tools, game and engine.  My architecture has also been built with portability in mind, encapsulating the Windows and DirectX systems within the world system in my engine.

Aims

The aims of my first project were to build a framework for tools, game and engine.

Architecture Details

Tools

The tools system is currently an asset build tool that ensures the assets such as high level shader language (HLSL) files are accessible to the game executable in a clean package.  I have left the source files provided by John-Paul largely unchanged with the exception of moving the files to an asset builder class.

There was name clashing between Windows functions and the names provided, so I added FromWindows to the names.  In the future the names need to be changed in a better way, but without knowing the next step I am content to leave them similar to prevent confusion.  I will stay vigilant about the potential problems I have expressed as the project moves forward.

Game

The game system is responsible for running the game loop.  In the future the game system may become a number of classes.  The game presently creates and initializes the world from the engine and then runs the game loop by calling update on the world.

The game can’t actually modify anything in the engine yet.  In the future I will need a way for the game to provide input to the engine that changes what is shown on screen.  A possibility for the current project was to have the game provide information about the white triangle, but that seemed like over-engineering and could end up hurting me as we continue to develop the systems.

Engine

The engine system is useful as a way to abstract details that gameplay programmers should not need to know.  As I was architecting the systems I recognized that creating a Window and graphics system is not part of the game.  I made a design decision that the Windows and graphics systems should be hidden from the gameplay programmers through a wrapper.  I called the wrapper the world system.

The world system knows about the current frame in the game.  The game includes the world and accesses features of the engine, such as creating the Window and graphics systems, through initialization, update and shutdown functions.  In the future I want to consider having multiple small classes allowing access to engine features, so the user of the engine can include smaller amounts of functionality and the world class doesn’t get bloated.  I am also concerned about keeping the project portable, which means I need a way to ensure the user is only getting what is required for their operating system.

Issues

When I started the project I was unsure regarding the separation between getting the project running and architecture.  Now I believe they were blended for the assignment.  There was a lot of reading and I had to double check myself to ensure that I didn’t miss requirements.

I had the project running as explained in John-Paul’s directions before starting architecture work.  I drew my architecture out on paper before starting the architecture related work.  The basic idea is similar to the architecture used in Game Engineering 1 with the addition of tools.

I had an issue involving closing the window with the red X.  The object would be destroyed and then I would call DestroyWindow on the already destroyed object.  The callback function for Windows messages was static, but my handle for the Window was not static and therefore could not be accessed in the callback function.  I was worried about having a leaked Windows handle.  After an e-mail from John-Paul and testing for leaks I found that instead of a leaked Windows handle I had a stale pointer.  I changed my handle to static and when the X is pressed the handle which is now a stale pointer is set to NULL before being destroyed.

I also had an issue with linking DirectX libraries in a static library and then using that library.  Libraries should be included at the application level and not in static libraries.  Including libraries in a static library and then using the library causes a duplication linker warning.

Next Steps

I am pleased with the architecture I have been building.  I would like to start merging code from Game Engineering 1 and AI into the system.  I would also like to start digging deep outside of class regarding engines, graphics and AI.

Acknowledgements

John-Paul was helpful on the project helping solve the Windows handle issue and the linker error for including static libraries in static libraries.

Ron Romero commiserated on deleting the Windows handle more than once which helped me continue trying to find the source of the linker error.

Jamie King helped find a logic issue in my shutdown functions.  I was using “XOR” to evaluate two conditions.  If both were true then the result would be false.  I meant for the logic to be “AND” so that if either condition were false then there was a problem shutting down.
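The bug reduces to two lines (hypothetical names, not the project's actual code): with two success flags, XOR is true only when exactly one flag is set, so two successful shutdowns wrongly read as failure.

```cpp
// Buggy: true only when exactly one subsystem shut down cleanly.
bool ShutDownOkXor(bool windowOk, bool graphicsOk)
{
    return windowOk ^ graphicsOk;
}

// Intended: true only when both subsystems shut down cleanly.
bool ShutDownOkAnd(bool windowOk, bool graphicsOk)
{
    return windowOk && graphicsOk;
}
```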

Time Estimate

I am not used to tracking my time, but will be doing so officially in the future.  I like the idea of associating time with commits.  Having said that my first breakdown of time is an estimate.

Activity Time Spent
Initial Working Build 6 hours
Pre-planning Architecture 1 hour
Building Architecture 6 hours
Squashing Bugs 2 hours
Write up 1 hour
Total 16 hours