Fall 2014 Post-Mortem

Earlier in the semester I put together my thoughts on the state of the semester. I am happy to report that the state of the Hostile Territory team has improved since then. The Hostile Territory team consists of five producers, five engineers, three technical artists and two artists. Weighing in at fifteen people, the team is large, and it has experienced growing pains due to its size. Overall, the team is in a better place than at the start of the semester. I will now detail the journey the semester provided.

I had sworn off large teams when I left Retro Yeti in late June 2014. Retro Yeti, at that time, suffered from a culture of last-minute heroics toward goals discussed by leads in closed meetings. I attributed the problem to large teams and left to form a new team. The goal was to get two producers, four engineers and two artists on the team. The new team formed with only three people. As a new team we built four prototypes. The prototype Roller went on to the “Utah Game Wars” competition as a finalist and has now been submitted to the “Independent Games Festival”. Overall the summer prototyping session was a success; however, when the semester began it was apparent that the fifth team would not form.

I decided to enjoy the success of Roller and let go of creating a fifth team. The first day there was a meeting about team formation. Other people had left their original teams, but instead of joining a fifth team they went to safer teams. I spoke briefly with Jose and Ryan regarding my situation, and speed dating teams was mentioned, so I set off to find a team. I had worked with Rody and Rob on Reflex Speed in the spring and decided it would be a good idea to join the Hostile Territory team. I asked to join the team and was welcomed. The talk of the team was switching from Unreal Engine 4 to Unity. I decided that I would be a good team member and keep my efforts on building whatever unassigned tasks were deemed important.

The engineers had a first meeting about getting a Unity build up and working. I chose to work on networking with George. There were synchronization issues in the build from last spring. I looked over the network code George had and realized that the connection being used was UDP peer-to-peer. While fast, a UDP peer-to-peer connection is not synchronized by design. George and I talked and concluded that we needed a client and dedicated-server system to ensure synchronization. The game would run on the server, with each client sending remote procedure calls to its character so that shooting stayed synchronized.

I was working on a client and dedicated-server system in the first week. I had movement working for more than one client using remote procedure calls when Triston, the engineering lead, asked me to leave networking behind in favor of split screen. Triston asked me to start work on a third person controller. I started work on the controller and was later asked whether it would help if Ron joined me on it. Ron and I began working on controllers. As we started to work together it became apparent that we were very different engineers. Ron spent time building up interfaces that every controller should use. I spent time building a controller that could wall walk.

During the second or third week the term re-architecting started being thrown around by Triston and Ron. The code that had been written for the game was seen as inadequate and was being rewritten in a less “hacky” way. I was mortified. The Hostile Territory team seemed to be doing well, but the re-architecting led by the leads looked, smelled and walked like the problems from Retro Yeti. Engineering stagnated through the end of September while the new architecture was being developed. Attitudes on the engineering team went from fun to fear. Shelwin started work on a weapon system that was compartmentalized from the new architecture. I kept myself busy working on the wall walking movement controller. I vocalized my distaste for the re-architecting process. I let my 9 AM to 1 PM schedule for Hostile Territory slip since the new architecture was up in the air. Other engineers were biding their time waiting for the release of the new architecture.

The new architecture was released. I updated the wall walking controller to meet the interface requirements of the new architecture. There were still bugs in the wall walking controller. Triston told me to remove wall walking and get the controller into the game. I cobbled together a solution to get the controller into the game without wall walking. The solution worked, but had problems. Over the weekend Ron released the controller he had built in combination with the architecture, and it was better than mine, so we moved forward with Ron’s controller. I brought attention to the wasted effort from both Ron and me in developing separate controllers and asked production for help. My morale tanked and I was unable to put my finger on any part of the project that was mine. Ron and I started working on understanding our differences and the strengths and weaknesses of both of our approaches. The competition between Ron and me started to become mutual respect.

I changed gears and started working on bug fixes. I spent time between bug fixes working on the wall walking controller. I was able to fix a significant issue where yaw rotations beyond 90 degrees in either direction caused undefined movement. I made strides with the wall walking controller and was working on getting the controller to work in environments with noisy surfaces. Production didn’t have a specific task for me. As a result I had extra time to work on wall walking when not fixing bugs for the IGF build, since the feature still came up as something good for the game.

I was working on two games for IGF.  I spent the morning with Hostile Territory and the evening with Roller.  Luckily the engineering classes gave extra time on due dates around IGF since I spent all my time working on Hostile Territory and Roller.  Hostile Territory was submitted on the Thursday before Halloween.  Roller was submitted on Halloween.  I was running on little sleep and took the weekend afterward off to recuperate after the IGF submissions.

The team played the IGF submission and came to a consensus that the game was not fun. I stated that the game we submitted was the same game we started with, except with a new cave level. During the process of building the IGF submission the team was told that iteration would occur afterward. The team pushed for prototyping. There was backlash from production, but the team was able to get a prototyping phase approved. As an engineer I am strong at prototyping and decided to take on three prototype ideas. The first idea was from our artist Mark and involved blowing up minions to remove tiles. The second idea was a play on Mark’s idea involving non-permanent, light-switch-style toggling of tiles. The third idea was Peijun’s musical chairs, where players had to move to their own territory on a timer or fall. I was able to prototype the first and second ideas. Peijun finished the third idea a week later. As a team we presented six prototypes to Jose and Ryan. The detonate and light switch prototypes were selected to move the design forward. Both included the wall walking controller. A major selling point for the new prototypes was clean visuals.

During the last weeks of the semester the team spent time getting a build ready for IGF. I spent the majority of my time jumping in to help others with tasks. I have started to solidify my role on the team as a prototype engineer, ensuring ideas on the team get a fair trial. I do not have the dread I experienced during the middle of the semester. The team seems to be working together better. Each engineer has organically taken a role to help the team. The roles are not all-inclusive of our work, but serve to denote the strengths we have individually brought to the team. The roles I identify are Triston as engineering manager, Saumya as GUI engineer, Ron as architecture and systems engineer, George as build engineer, Peijun as animation engineer, Shelwin as weapon, sound and shader engineer, Sty as technical artist and myself as prototype engineer.

The engineering team has been gaining steam since the IGF submission. The culture is improving and we can get more work done in less time. We have more respect for each other than when we started. With the success of prototyping I would like to continue to provide engineering expertise to move the design forward. If anyone on the team has an idea to move the design forward, I would love, after talking with Rody, to build it so the team can give the idea a fair trial.

Moving forward I will be addressing issues with the camera and wall walking controller.  Re-architecting has prevented the team from building some of the suggestions from Jose and Ryan.  I would like to prototype some of the ideas including a controller where the camera isn’t controlled directly and a fixed camera at either end of the tube.  Both ideas have design problems, but deserve a fair trial.  I will be working with Rody to find good solutions to design problems with both the light switch and detonate prototypes.

Game Engineering 2 Assignment 11

Controls

Atlas – pressing [0-9] on the keyboard will show the image for that number.

Light: T – Positive Z; G – Negative Z; F – Negative X; H – Positive X; R – Add Red Light; V – Subtract Red Light; Y – Add Green Light; N – Subtract Green Light;

Camera: I – Positive Z; K – Negative Z; J – Negative X; L – Positive X; U – Positive Y; M – Negative Y; Insert – Rotate in Positive Z; Delete – Rotate in Negative Z; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive X; Down Arrow – Rotate in Negative X

Objects: WSAD – Move Object; Q – Up; Z – Down.

Abstract

The scope of assignment 11 was to add resolution independent sprites, drawn with a different rendering mode, to the game. The shaders involved needed to allow atlasing. In addition to changing rendering modes, changing the sampling state was introduced.

Aims

The aim of the eleventh assignment was to implement resolution independent sprites, mix indexed and non-indexed drawing, and understand render states.

Implementation

My implementation started with improvements to the scene file. I am working toward an architecture where the user can leave data out of the scene file and have it initialize to default values. A case where the approach is most noticeable is determining the keys that move an object. In addition I have added handling of input and movement controllers to the scene file. The scene file currently contains all the actors in the game.

The scene file is being shown. Each actor is enclosed in curly braces. The type is specified and handled by the World at run-time when creating in-game objects for the scene. The name is the actor name in the game. Since I presently only allow one light source, the name Light is fitting. The location is a vector in the XZ plane from the origin. The rotation for a light is currently being used to determine color.
The new part is the controller. The movement part links to a controller in game that only does translation at a fixed velocity. The input controller determines the action taken for input.
The controls have changed so that the user should only define the directions that make sense. For the light, a Y direction doesn’t make sense, since the displacement in the XZ plane is the vector that shifts the light. That means Y has been excluded from the movement.
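As a rough sketch of the shape described here (the actual file is only shown in the screenshot, so the field names below are illustrative guesses, not the project's real ones), a light actor might look like:

```lua
-- Hypothetical sketch of a scene-file actor entry; real field names
-- are in the screenshot and may differ.
{
	Type = "Light",
	Name = "Light",
	Location = { X = 0.0, Z = 0.0 },  -- vector in the XZ plane from the origin
	Rotation = { X = 1.0, Y = 1.0, Z = 1.0 },  -- reused to determine color
	Controller =
	{
		Movement = "FixedVelocity",  -- translation only, at a fixed velocity
		Input =
		{
			-- only the directions that make sense are defined; Y is omitted
			PositiveZ = "T", NegativeZ = "G",
			NegativeX = "F", PositiveX = "H",
		},
	},
},
```

Leaving a block out entirely (for example the Controller) is what triggers the default-value path described above.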

I have defined sprites that render in screen space. I have defined a way to add sprites to the game through the scene file. The current implementation is less generic than I would like, so I will be tuning the implementation.

Sprites can be defined in the scene file. Two sprite definitions are shown. The first sprite is the logo and uses the logo material. The logo material has a texture that is the EAE 6320 logo. The controller for the sprite is not defined. When the scene file has no data for the controller, the NoMove controller is given to the actor, preventing motion during the update loop.
To show that it does not need to be declared at all, the second actor has no controller block. The second sprite is a sprite sheet. Presently, to show individual numbers, the image is atlased by an internal controller. I will be adding a way to reference the controller from the scene file and provide specific UV coordinates on a per-button basis in the future.
I specify Top, Bottom, Left and Right. As I was implementing sprites, the idea of a top and left coordinate with a scaling factor came to mind. I am currently only using the top and left coordinates and will be updating the sprite data. Scaling the texture is not currently in the game.
Scaling would be useful in my implementation because I currently maintain the size of the original texture. With a scaling factor, UI programmers could re-size the texture in a data-driven way.
An interesting design specification of sprites is that they are resolution independent and hold position on the screen. Having been a UI programmer, I understand the importance of keeping the image undistorted and in the correct position. A potential pitfall of my method is that the area of the screen taken by the sprite depends on resolution. Adding flexibility for area may be a future iteration.
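A sketch of what the two sprite entries described above might look like (hypothetical field names; the real definitions appear only in the screenshot):

```lua
-- Hypothetical sketch; real names are in the screenshot above.
{
	Type = "Sprite",
	Name = "Logo",
	Material = "./data/logo.material",  -- texture is the EAE 6320 logo
	Position = { Top = 0.1, Left = 0.1 },
	-- no Controller block: the NoMove controller is assigned by default
},
{
	Type = "Sprite",
	Name = "Numbers",
	Material = "./data/numbers.material",  -- sprite sheet with digits 0-9
	Position = { Top = 0.1, Left = 0.8 },
	-- UVs are currently atlased by an internal controller; per-button
	-- UV coordinates will move into this file in a future iteration
},
```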

I added new events for PIX. PIX has proven to be a fine way to debug DirectX-related calls. I have been using it for all sorts of issues and feel comfortable with the features I have used.

PIX events have been created for both drawing meshes and drawing sprites. The render state is modified based on whether meshes or sprites are being drawn.
The sprites in the game are resolution independent. The top left window is an 800 by 600 window. The bottom window is 1600 by 600. The leftmost window is 800 by 1600. I didn’t line them up pixel perfect, but the size is the same.
The sprite 1 in blue on the top left side of the screen is being shown when the 1 button on the keyboard is pressed. The 1 shown on the screen is from a sprite sheet. The sheet has the numbers [0-9] on it.
The same sprite is being used, but now the texture UVs have changed to show the 6. The 6 key is being held down.

Issues

I have traced the issues with my release build to the view to screen matrix using PIX. I am getting closer to resolving the issue. I have screenshots detailing the issue.

The PIX screenshot shows the information provided to the vertex shader for a cube. The values for position are well defined.
After the vertex shader runs the values for x and y coordinates are huge.
The view to screen matrix is in registers c4 to c7. The numbers in release are not well defined. In debug the values are between -10 and 10. I have been looking into the creation of this matrix and will be creating a log file to figure out how to fix the issue.

I also had an issue with my sprite fragment shader. In order to add alpha information the shader needs to output four floats. I was only outputting three floats. The result was a messy image. Detection of the issue involved loading the .dds file into Visual Studio and removing channels until the image matched. When the image looked the same I knew which channels I was missing in the fragment shader.

I also defined the triangles of a sprite wrong such that the triangles were overlapping with a wedge of black on the left side. The fix was easy to figure out from PIX debugging.

Time Estimate

The workload for the project was about 14 hours.

Activity Time Spent
Updating Scene File 4 hours
Implementing Sprites 4 hours
Tracking Down Bugs 4 hours
Write up 2 hours
Total 14 hours

Game Engineering 2 Assignment 10

Controls

Light: T – Positive Z; G – Negative Z; F – Negative X; H – Positive X; R – Add Red Light; V – Subtract Red Light; Y – Add Green Light; N – Subtract Green Light;

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Objects: WSAD – Move Object

Abstract

This project added lighting to the game by modifying the fragment shader. The type of lighting added is called diffuse. Diffuse lighting treats light as a unidirectional field applied to each vertex of the mesh.

The normal at each vertex is interpolated and sent to the fragment shader, where color is determined using the cosine of the angle between the normal and the light direction. The cosine is obtained through the dot product: the dot product is the product of the magnitudes of two vectors multiplied by the cosine of the angle between them, and since the normal vector and the diffuse lighting vector are both of unit length, the dot product is exactly that cosine. Light color is applied directly to that cosine, creating a shaded look. An ambient light is also applied to the pixel to prevent the scene from being entirely dark.
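Written out as a formula, the computation described above is the standard Lambertian form (a sketch of the math, not copied from the project's actual shader; here n̂ is the interpolated unit normal and l̂ the unit light direction):

```latex
\text{pixel} = \text{texColor} \times \bigl( \text{lightColor} \cdot \max(0,\ \hat{n}\cdot\hat{l}) + \text{ambient} \bigr),
\qquad
\hat{n}\cdot\hat{l} = \lVert\hat{n}\rVert\,\lVert\hat{l}\rVert\cos\theta = \cos\theta .
```

The max with zero is the usual way to keep surfaces facing away from the light from contributing negative color, and the ambient term is what keeps them from going fully black.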

Aims

The aim of the tenth assignment was to implement diffuse lighting and familiarize with PIX pixel debugging.

Implementation

My implementation required updating scene files for lights.  I had lingering bugs from the prior version that also needed squishing.  I am nearly done with bugs, but there is a rendering bug in release where objects are not being shown.  I will be investigating the issue using output to a log file.  The application no longer crashes while in release.  That issue was related to not building assets and not checking to see if there was a scene file.

I made some improvements to the scene file.  The light type has been added to the scene file.  I also fixed issues with order of the actors.  If non-renderable actors were placed before renderable actors the indexing was off for scene files.  I have fixed the indexing issue.

There have been improvements to the scene file. A new type is light which indicates a light source in the game. I am still uncertain how to add more than one camera or light source, but the format allows that expansion.

Ambient light is currently not an actor in the scene file.  In fact the ambient light has been hard-coded.  I have a screenshot with line number information included that does not give away any code.  The line number is 209 in Graphics.cpp.

The variable names for ambient light should remain the same until after grading. I am going to change ambient light to be an actor in the scene file. I am considering more separation in the scene file between camera, lights and actors.

The project now renders better. I changed the index order output from Maya from 1,2,3 to 1,3,2, which corrects the triangle winding. The shapes now look rounded with the addition of light.

The rabbit texture mapped wonderfully to the sphere in the scene. The target texture did not map as well to the cube. This is the first time I have used Maya for a project, so I am still learning how to map textures correctly. The texture mapping may be the reason the floor is not lit well, but if I change the ambient color to be more red, green or blue the white of the floor shows prominently.

I added much more information to the mesh.  The data I currently have for each vertex is the XYZ coordinates, UV coordinates for texturing, normal XYZ coordinates for lighting, tangent and bitangent XYZ  values for curvature related functions and the coloring information for RGBA values.

The new mesh format includes a lot more information. The vertex definition starts on line 5 and ends at line 39.
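As a sketch, one vertex entry in the new format might look like the following (field names guessed from the description above; the real definition is in the screenshot):

```lua
-- Hypothetical vertex entry matching the per-vertex data listed above.
{
	Coordinates = { X = 0.0, Y = 1.0, Z = 0.0 },  -- position
	UV          = { U = 0.5, V = 0.0 },           -- texture coordinates
	Normal      = { X = 0.0, Y = 1.0, Z = 0.0 },  -- for lighting
	Tangent     = { X = 1.0, Y = 0.0, Z = 0.0 },  -- curvature-related functions
	Bitangent   = { X = 0.0, Y = 0.0, Z = 1.0 },
	Color       = { Red = 1.0, Green = 1.0, Blue = 1.0, Alpha = 1.0 },
},
```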

PIX is worthwhile for debugging.  I was able to debug a pixel.  The pixel information is shown below.

This is the rendering of the sphere with the rabbit texture. I right-clicked a gray pixel inside the rabbit’s ear. The pixel data is shown in the next screenshot.
This is the shader that showed up when debugging the pixel in the rabbit’s ear. The shader is shown, allowing debugging of the values by variable or register. This is a debug version of the game. I do not think PIX works with a release build of the game presently, since we have not enabled the correct debugging flags for DirectX; the symbols would not exist.

Issues

I am still having trouble with the release build.  The image is black.  I am going to dump information to a log file to track down the issue.

I have also been trying to figure out the math behind some of the work we have been doing for the project.  I will need to read text on the matter to really understand the math.  My understanding presently is that we are operating in a projected space.  I am excited to figure out more about the math behind graphics.

Time Estimate

The workload for the project was about 12 hours.

Activity Time Spent
Tracking down bugs from assignment 09 3 hours
Implementing lighting 4 hours
Adding lighting to the rest of the systems i.e. scene 4 hours
Write up 1 hour
Total 12 hours

Game Engineering 2 Assignment 09

Controls

Everything minus camera and floor: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate In Negative Y; G – Rotate in Negative Z

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated with a new Maya exporter. The exporter allows meshes to be built into the format I use to import meshes into the game. The project now has a scene file which allows me to make multiple actors with the same material and/or mesh. I can also change controls, orientation and movement speed. The project now has a settings file that allows changing height, width and full screen mode. The project now has PIX debugging information.

Aims

The aim of the ninth assignment was to get multiple meshes into the game from a complex program like Maya.

Implementation

A mesh I used in the game was sphere.mesh.lua. File names with the mesh.lua extension aren’t being resolved properly, so please specify the entire file name including the extension.

The build that is downloadable is a debug build. The release build is pretending that 0 = -1231253242 which is out of range. I will need to fix the release build before next assignment. It was running in Release until I started playing with full screen mode.
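Full screen mode comes from the settings file mentioned in the abstract. A sketch of what that file might look like (the field names are my guesses, not the project's actual file):

```lua
-- Hypothetical settings file; the real field names may differ.
return
{
	Width = 800,         -- back buffer width in pixels
	Height = 600,        -- back buffer height in pixels
	FullScreen = false,  -- windowed by default
}
```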

I started by building the scene file. I had a strong interest in the file since last time. I wanted to be able to load meshes and materials on anything. I am now able to do so, but that implementation ate a lot of time. Hopefully the investment now pays off in the end.

An example scene file is:

***Modified from text to picture for clarity 11/19/2014***

PictureOfSceneFormat

***End of modification from text to picture for clarity***

Ultimately the file needs an editor for non-programmers. I haven’t built one yet. I will look into building one in the future.

Here is a screenshot of the game with four meshes and multiple textures:

Assignment09Rendering

Here is a screenshot of PIX information showing new events (Set Material and Draw Mesh):

Assignment09PixPicture

Issues

I ended up behind due to iterating for projects. I was able to meet my goals. I am still having issues with release mode, but I have no time to solve the issues.

***Added section with screenshots regarding the issue 11/19/2014***

The call that fails is sent with a value of 0:

BeforeSetupScene

The failure in the call is shown here:

InSetupScene

***End of section with screenshots****

Time Estimate

The workload with scene files and everything else was around 16 hours.

Activity Time Spent
Scene File 10 hours
Maya Exporter 3 hours
Full Screen 2 hours
Write up 1 hour
Total 16 hours

Game Engineering 2 Assignment 08

Controls

Box: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate In Negative Y; G – Rotate in Negative Z

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated with a texture builder. The new builder facilitates placing textures into the game. Textures are images that are wrapped around a mesh using UV coordinates. U and V were chosen because X, Y and Z were taken. U and V serve as coordinates to map a two dimensional image onto a three dimensional mesh. The idea is akin to melting Shrinky Dinks onto an egg for Easter.

Aims

The aim of the eighth assignment was to build a way to get textures into the game. The result is a system that allows multiple meshes, materials and textures to be loaded into the game. The process is mostly automatic, which simplifies adding art to the game.

Implementation

I started by adding the texture builder project to my tools. I modified my AssetsToBuild script to handle the new texture builder. Here is the new layout of the AssetsToBuild file.

--[[
	This file lists every asset to build
]]

return
{
	-- Shader Programs
	{
		builder = "ShaderBuilder.exe",
		assets =
		{
			-- Vertex shader programs
			{
				name = "vertexShader",
				extensions =
				{
					source = "hlsl",
					target = "shd"
				},
				arguments = { "vertex" },
			},
			-- Fragment shader programs
			{
				name = "fragmentShader",
				extensions =
				{
					source = "hlsl",
					target = "shd"
				},
				arguments = { "fragment" },
			},
		},
	},
	-- Meshes
	{
		builder = "MeshBuilder.exe",
		assets =
		{
			{
				name = "cube",
				extensions =
				{
					source = "mesh.lua",
					target = "mesh",
				},
				arguments = {}
			},
			{
				name = "floor",
				extensions =
				{
					source = "mesh.lua",
					target = "mesh",
				},
				arguments = {}
			},
		},
	},
	{
		builder = "TextureBuilder.exe",
		assets =
		{
			{
				name = "Rabbit",
				extensions =
				{
					source = "png",
					target = "dds",
				},
				arguments = {}
			},
			{
				name = "Bricks",
				extensions =
				{
					source = "png",
					target = "dds",
				},
				arguments = {}
			},
		},
	},
	-- Generic Assets
	-- (That just get copied as-is rather than built)
	{
		builder = "GenericBuilder.exe",
		assets =
		{
			{
				name = "bricks",
				extensions =
				{
					source = "material.lua",
					target = "material",
				},
				arguments = {}
			},
			{
				name = "rabbit",
				extensions =
				{
					source = "material.lua",
					target = "material",
				},
				arguments = {}
			},
		},
	},
}

I updated my material files. The new material file has specifications for textures. The texture and sampler are named explicitly. I would like to separate the texture from the material, but am unsure how to do so at this time. This is the bricks material.

return
{
    Shaders = {
        Vertex = "data/vertexShader.shd",
        Fragment = "data/fragmentShader.shd",
    },
    Texture = {
        Path = "./data/Bricks.dds",
        Sampler = "g_color_sampler",
    },
    Constants = {
        PerMaterial =
        {
            ColorModifier =
            {
                Red = 1.0,
                Green = 1.0,
                Blue = 1.0,
            }
        }
    },
}

I updated the mesh format to include the UV coordinates of the mapped texture. I also added information about the texture that is applied to the mesh. I would like to be able to divorce the mesh from the texture, but was unable to figure out a solution. Here is the floor mesh, which uses the bricks texture.

return
{
    Material = {
        Material = "./data/bricks.material",
        Texture = "./data/Bricks.dds",
    },
    Vertices = {
        {
            Coordinates =
            {
                X = -10.0,
                Y = -2.0,
                Z = -10.0,
                U = 0.0,
                V = 1.0,
            },
            Color =
            {
                Red = 100.0,
                Green = 100.0,
                Blue = 100.0,
            },
        },
        {
            Coordinates =
            {
                X = 10.0,
                Y = -2.0,
                Z = -10.0,
                U = 1.0,
                V = 1.0,
            },
            Color =
            {
                Red = 100.0,
                Green = 100.0,
                Blue = 100.0,
            },
        },
        {
            Coordinates =
            {
                X = -10.0,
                Y = -2.0,
                Z = 10.0,
                U = 0.0,
                V = 0.0,
            },
            Color =
            {
                Red = 100.0,
                Green = 100.0,
                Blue = 100.0,
            },
        },
        {
            Coordinates =
            {
                X = 10.0,
                Y = -2.0,
                Z = 10.0,
                U = 1.0,
                V = 0.0,
            },
            Color =
            {
                Red = 100.0,
                Green = 100.0,
                Blue = 100.0,
            },
        },
    },
    Primitives = {
        {
            [0] = 0,
            [1] = 2,
            [2] = 3,
        },
        {
            [0] = 0,
            [1] = 3,
            [2] = 1,
        },
    },
}

There were a lot of builder- and file-handler-specific changes required for the new file formats. I had to ensure the new meshes, materials and textures were being loaded correctly. Textures are compiled by the TextureBuilder to the .dds format. Meshes are compiled to a binary format. Materials are copied as-is to the data folder. The process is automated and helps get new assets into the game faster.

Here are two images of the new implementation. The first shows the cube facing the camera; the second shows the cube rotated. Both use the new textures. The brick texture turned out darker than expected; the rabbit texture worked well.

The rabbit and bricks have been applied to the cube and floor plane respectively.
The cube has been rotated to show two sides, with the rabbit reversed on the far side.  This result is due to the layout of the UV coordinates.  The bottom is also visible, but the UV coordinates do not map to the bottom face since I am only using eight vertices.  Mapping the bottom correctly would require more vertices.
This is a PIX capture of the SetTexture instruction.  The data at the pointer to the texture is being shown.  The texture is the rabbit texture being mapped to the cube.

Organization

I created a new texture class in the Engine. I made a new texture builder that compiles image files to .dds for import by the Engine. The data formats for mesh and material changed significantly. Overall, however, the assignment was light on organizational changes.

Issues

I had the wrong path to textures and ended up with a black screen for a while. I had to use PIX to figure out that the texture was being set to a NULL pointer. Once PIX revealed the NULL pointer, the problem was straightforward to fix.
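A cheap guard against this class of bug is to validate the pointer before issuing the draw call. A hedged sketch with stand-in types; Texture and SetAndDraw are hypothetical, not the engine's actual API:

```cpp
#include <cstdio>

// Stand-in for the engine's texture type.
struct Texture { const char* path; };

// Fails loudly when the texture pointer is null instead of silently drawing a
// black screen, which would otherwise require a PIX capture to diagnose.
bool SetAndDraw(const Texture* texture)
{
    if (texture == nullptr)
    {
        std::fprintf(stderr, "SetTexture called with a null texture pointer\n");
        return false;
    }
    // ... issue SetTexture and the draw call here ...
    return true;
}
```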

Time Estimate

The workload took approximately 12 hours. I am excited about the work done on this assignment. Adding art through this pipeline will be much easier than what we are experiencing on team projects. I look forward to the next assignment.

Activity | Time Spent
Implementing the TextureBuilder | 2 hours
Updating formats, file handlers and the engine | 6 hours
Ensuring everything worked | 3 hours
Write up | 1 hour
Total | 12 hours

Game Engineering 2 Assignment 07

Controls

Box: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate in Negative Y; G – Rotate in Negative Z

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated with a mesh builder. The new builder takes data from mesh files and turns that into a binary format that is quicker to load into the game. The game has a loader which will load any number of meshes into the game.

Aims

The aim of the seventh assignment was to create a builder for mesh data. The purpose is to avoid hard-coding mesh-related data. The project no longer creates random meshes.

Implementation

I started by modifying my AssetsToBuild.lua to specify extensions for each asset instead of for an individual builder. The reason for the change was that the system needs to handle files with differing extensions, and must not copy files that share a base name but carry a different, possibly harmful, extension. The new format looks like:

--[[
This file lists every asset to build
]]

return
{
    -- Shader Programs
    {
        builder = "ShaderBuilder.exe",
        assets =
        {
            -- Vertex shader programs
            {
                name = "vertexShader",
                extensions =
                {
                    source = "hlsl",
                    target = "shd"
                },
                arguments = { "vertex" },
            },
            -- Fragment shader programs
            {
                name = "fragmentShader",
                extensions =
                {
                    source = "hlsl",
                    target = "shd"
                },
                arguments = { "fragment" },
            },
        },
    },
    -- Meshes
    {
        builder = "MeshBuilder.exe",
        assets =
        {
            {
                name = "cube",
                extensions =
                {
                    source = "mesh.lua",
                    target = "mesh",
                },
                arguments = {}
            },
            {
                name = "floor",
                extensions =
                {
                    source = "mesh.lua",
                    target = "mesh",
                },
                arguments = {}
            },
        },
    },
    -- Generic Assets
    -- (That just get copied as-is rather than built)
    {
        builder = "GenericBuilder.exe",
        assets =
        {
            {
                name = "yellow",
                extensions =
                {
                    source = "material.lua",
                    target = "material",
                },
                arguments = {}
            },
            {
                name = "blue",
                extensions =
                {
                    source = "material.lua",
                    target = "material",
                },
                arguments = {}
            },
        },
    },
}
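To illustrate why per-asset extensions matter, here is a minimal sketch of how a build script might derive source and target paths from an asset entry; BuildPath is a hypothetical helper for illustration, not the actual build script:

```cpp
#include <string>

// Builds a path from a directory, an asset name, and that asset's own
// extension. Because the extension comes from the asset entry rather than the
// builder, two assets sharing a base name can never resolve to the same file.
std::string BuildPath(const std::string& directory, const std::string& name,
                      const std::string& extension)
{
    return directory + "/" + name + "." + extension;
}
```

For example, the "cube" mesh asset would resolve to a source of Assets/cube.mesh.lua and a target of data/cube.mesh.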

I built a file format for the mesh files. The mesh format specifies vertices and triangles. The file format supports more vertices per primitive, but the system would require modification to use anything other than triangles. I chose to list vertices and primitives without numeric labels so that the system can derive the counts from the size of the Lua tables. That means I do not need to tell the system how many vertices or primitives there are in the mesh. I assume the user will provide correct data for a mesh. I am in the process of allowing changes of material and texture, so that field is currently a placeholder. The mesh file format looks like:

return
{
    Material = "",
    Vertices = {
        {
            Coordinates =
            {
                X = -0.9,
                Y = -0.9,
                Z = -0.9,
            },
            Color =
            {
                Red = 255.0,
                Green = 255.0,
                Blue = 0.0,
            },
        },
        {
            Coordinates =
            {
                X = 0.9,
                Y = -0.9,
                Z = -0.9,
            },
            Color =
            {
                Red = 0.0,
                Green = 255.0,
                Blue = 0.0,
            },
        },
        {
            Coordinates =
            {
                X = -0.9,
                Y = 0.9,
                Z = -0.9,
            },
            Color =
            {
                Red = 255.0,
                Green = 0.0,
                Blue = 0.0,
            },
        },
        {
            Coordinates =
            {
                X = 0.9,
                Y = 0.9,
                Z = -0.9,
            },
            Color =
            {
                Red = 0.0,
                Green = 0.0,
                Blue = 255.0,
            },
        },
        {
            Coordinates =
            {
                X = -0.9,
                Y = -0.9,
                Z = 0.9,
            },
            Color =
            {
                Red = 255.0,
                Green = 0.0,
                Blue = 255.0,
            },
        },
        {
            Coordinates =
            {
                X = 0.9,
                Y = -0.9,
                Z = 0.9,
            },
            Color =
            {
                Red = 0.0,
                Green = 255.0,
                Blue = 255.0,
            },
        },
        {
            Coordinates =
            {
                X = -0.9,
                Y = 0.9,
                Z = 0.9,
            },
            Color =
            {
                Red = 150.0,
                Green = 192.0,
                Blue = 75.0,
            },
        },
        {
            Coordinates =
            {
                X = 0.9,
                Y = 0.9,
                Z = 0.9,
            },
            Color =
            {
                Red = 75.0,
                Green = 192.0,
                Blue = 150.0,
            },
        },
    },
    Primitives = {
        {
            [0] = 0,
            [1] = 2,
            [2] = 1,
        },
        {
            [0] = 3,
            [1] = 1,
            [2] = 2,
        },
        {
            [0] = 0,
            [1] = 1,
            [2] = 4,
        },
        {
            [0] = 1,
            [1] = 5,
            [2] = 4,
        },
        {
            [0] = 2,
            [1] = 6,
            [2] = 3,
        },
        {
            [0] = 6,
            [1] = 7,
            [2] = 3,
        },
        {
            [0] = 4,
            [1] = 2,
            [2] = 0,
        },
        {
            [0] = 4,
            [1] = 6,
            [2] = 2,
        },
        {
            [0] = 3,
            [1] = 5,
            [2] = 1,
        },
        {
            [0] = 3,
            [1] = 7,
            [2] = 5,
        },
        {
            [0] = 6,
            [1] = 4,
            [2] = 5,
        },
        {
            [0] = 7,
            [1] = 6,
            [2] = 5,
        },
    },
}

There are a number of ways to write and read the binary data. I decided to reduce disk I/O by writing and reading the entire binary file at once. Accomplishing this goal required placing the data into a single memory allocation. There were four variables to copy, so I used four memcpy calls for writing into the memory allocation while keeping track of the offset. To read the file I read the entire binary file in at once. I cast the number of vertices and primitives off the start of the memory block. Using that number I get the vertices and primitive data using a single memcpy for each. With that data loaded my mesh class can pull the data from the reader.
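A minimal sketch of that single-allocation layout, assuming a simplified stand-in Vertex type (the engine's real vertex format differs): the two counts go first, followed by the two arrays, with four memcpy calls on the write side and a mirror-image read.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Simplified stand-in vertex: position plus a color, 16 bytes with no padding.
struct Vertex { float x, y, z; std::uint8_t r, g, b, a; };

// Packs [vertexCount][indexCount][vertices...][indices...] into one buffer,
// which can then be written to disk with a single call.
std::vector<std::uint8_t> PackMesh(const std::vector<Vertex>& vertices,
                                   const std::vector<std::uint32_t>& indices)
{
    const std::uint32_t vertexCount = static_cast<std::uint32_t>(vertices.size());
    const std::uint32_t indexCount = static_cast<std::uint32_t>(indices.size());
    std::vector<std::uint8_t> buffer(sizeof(std::uint32_t) * 2 +
                                     sizeof(Vertex) * vertexCount +
                                     sizeof(std::uint32_t) * indexCount);
    std::size_t offset = 0;
    std::memcpy(buffer.data() + offset, &vertexCount, sizeof vertexCount);
    offset += sizeof vertexCount;
    std::memcpy(buffer.data() + offset, &indexCount, sizeof indexCount);
    offset += sizeof indexCount;
    std::memcpy(buffer.data() + offset, vertices.data(), sizeof(Vertex) * vertexCount);
    offset += sizeof(Vertex) * vertexCount;
    std::memcpy(buffer.data() + offset, indices.data(), sizeof(std::uint32_t) * indexCount);
    return buffer;
}

// Reading reverses the process: copy the counts off the front of the block,
// then pull each array out with one memcpy.
void UnpackMesh(const std::vector<std::uint8_t>& buffer,
                std::vector<Vertex>& vertices, std::vector<std::uint32_t>& indices)
{
    std::uint32_t vertexCount = 0, indexCount = 0;
    std::size_t offset = 0;
    std::memcpy(&vertexCount, buffer.data() + offset, sizeof vertexCount);
    offset += sizeof vertexCount;
    std::memcpy(&indexCount, buffer.data() + offset, sizeof indexCount);
    offset += sizeof indexCount;
    vertices.resize(vertexCount);
    std::memcpy(vertices.data(), buffer.data() + offset, sizeof(Vertex) * vertexCount);
    offset += sizeof(Vertex) * vertexCount;
    indices.resize(indexCount);
    std::memcpy(indices.data(), buffer.data() + offset, sizeof(std::uint32_t) * indexCount);
}
```

The trade-off is that the file is only portable between machines that agree on endianness and struct layout, which is fine for a build pipeline targeting one platform.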

Organization

I made a new project with shared file formats and file loading types. The tools (i.e. the new mesh builder) use the mesh loader to read in and write out the binary file. The engine uses the same loader to load the binary mesh file. The data in the loader is identical in both places, so it made sense to use a common class.

Issues

I couldn’t remember why my mesh class stored counts for both primitives and indices; only one place used both, and drawing uses the primitive count explicitly. My best guess, and the reason I left both counts in place, is that they make it easier to add primitives with shapes other than triangles. If memory usage ever becomes an issue, it will be worth checking whether storing both numbers takes up too much room, since there will likely be many meshes.

Time Estimate

The workload took approximately 12 hours. I also worked on parts of assignment 08 while working on assignment 07. I am excited to get both assignments done soon. The builders are going to allow some interesting engineering in the future.

Activity | Time Spent
Implementing of Mesh Builder and Mesh Format | 4 hours
Fixing AssetsToBuild.lua and Converting Mesh to Binary Data | 4 hours
Incorporating the New Mesh Data into the Engine | 3 hours
Write up | 1 hour
Total | 12 hours

The Road to IGF: Roller Post Mortem

The road to IGF with Roller was interesting.  Roller was a game prototyped in fifteen hours during the course of one week.  My wife mentions that Roller was built in two evenings.  The remaining time was used for polish.  I built the prototype with Lonnie Egbert and John Schwarz.  The prototype was built as part of a movement known as the Fifth Team that occurred over the summer.  Spring semester working with Retro Yeti left me drained.  The members of the team are amazing, but the culture and ideals developed by the team did not fit me well.  I knew another semester with Retro Yeti would end poorly, so I left with the idea of forming a new team.  The new team would hold to the ideals of rapidly building ideas to find fun early while managing scope and expectations.  I was working a forty-hour-a-week internship with Blackrock Microsystems and doing contract work for another project, but decided to commit to an aggressive schedule of fifteen hours a week for the Fifth Team.  I felt passionate about building a better culture for game development.

Prototyping with the Fifth Team was energizing.  John joined me from the beginning.  We began working in Unreal Engine 4.  The first week resulted in a climbing game that I found hilarious.  The idea fell flat for most people, but the rope impressed Lonnie, so I asked if he wanted to join us.  He agreed and we moved forward.

The second week we started considering ideas.  I proposed a game with marbles and dominoes.  There was consensus and everyone started working on games featuring marbles and dominoes.  John was working on an inclined runner style game.  Lonnie was working on making bricks explode when run into by the marble.  I was working on a game where marbles had to jump on dominoes or fall into hot lava.  We were showing off our work and laughing as we went.  Lonnie proposed adding the rope from the previous game onto the marbles.  The three of us started talking about a jousting game with the rope.  I showed Lonnie and John how I built the rope and added it to the marble.  Lonnie then proposed making the rope act like a pogo stick.  Control of the ball was odd, so we abandoned the idea.  Jousting was almost in place when the rope was removed since it wasn’t adding anything to the game.  The second evening I had a top-down camera with two marbles that had to push each other out of the square arena.  Erin loved the game and told me to continue building it.  Roller was born.

The third and fourth week there was mention of continuing the marble game.  Holding to the idea of the Fifth Team I insisted that we prototype another game.  Per John’s suggestion we started working on a game with heavy narrative.  We prototyped a Guess Who game that ended up being a game show.  Players didn’t know who they were, so they asked random boxes.  The boxes provided clues and the player had to guess who they were.  The game didn’t go anywhere.  We then started building a dungeon crawling game where narrative would be central.  There were some tech demos and a sample narrative was written, but there was no progress.

I met with Ryan regarding the Fifth Team.  We talked about the outcomes of the program.  We talked about the importance of playtesting.  We talked about how the thesis game produced while in the program doesn’t get people employment.  We talked about how important the networking offered by the program is with regard to future employment.  Overall the meeting went well, but I still felt like a pariah.  I knew the start of the semester would be rough.  The roughness was lessened by being a finalist in Utah Game Wars.

The week before classes started I decided to enter Roller into the Utah Game Wars contest.  Roller was a finalist in the competition.  I updated the game to have four players.  The game’s design focused on being simple enough that grandparents could play with grandchildren.  The game garnered smiles at Utah Game Wars.  Judges mentioned that it should be a mobile game.  After consideration Lonnie, Erin and I decided that Roller wouldn’t make sense as a mobile game.  I wanted the game to be played by families in the same way my family played games like Mario Party, Mario Kart or Super Smash Brothers during the holidays.

The semester began and I did not push for a Fifth Team.  There were two members of the team at the start of the semester.  John was not committed full-time to the team, so I was effectively the sole member.  That meant I was speed dating teams.  I joined the Hostile Territory thesis game team since the team seemed closest to the ideals of the Fifth Team.  I wanted to split my time between school and working on Roller.  That is not what happened!

After Utah Game Wars Roller only saw active development the week before IGF.  I had an insane amount of work to do on the game to get it ready for IGF.  My goal after Utah Game Wars was to have the game ready for IGF submission.  I added menus to the game in a day.  I added a new snake level in a day.  I added AI in a day.  I spent a day working until 5 AM polishing all the features and adding a timed mode to the game.  The energy of the Fifth Team was back.  The ideals however were lost.  The game needed to be done now.  That means that from start to its current state Roller has been built in about fourteen days.  I am still passionate about the game.  I know it makes people smile.  Roller has been my outlet for when the thesis student project breaks down.

I am going to work on process for Roller.  Ultimately, the current process is broken.  Even with all my other engagements, i.e. school and the thesis project, I should not have to work as hard as I did this last week.  I worked as hard as I did because I am passionate about Roller.  I think it can bring a lot of happiness to players as they play with family and friends.  I am grateful to my wife for helping me stay intact during the building of Roller.  She has provided a lot of support.  Roller is rolling forward and will continue to move forward.

The Road to IGF: Hostile Territory Post Mortem

I have been taking some personal time after submitting to IGF last Friday.  Hostile Territory was submitted on Thursday.  I am writing a post mortem of the journey to IGF since the beginning of the semester.  The focus of this post is on Hostile Territory.

The semester started well.  Roller made waves by being a finalist in Utah Game Wars.  There was a lot of interest among the cohort in the game and I let it wane.  I did not think it worthwhile to gloat or try to create another team around Roller.  The Fifth Team was created to embrace rapid prototyping, finding fun ideas early while managing scope and expectations.  The idea was a hard sell over the summer, and riding the hype of the contest seemed like a road to disappointment.  Instead Roller would become my personal project with the goal being an IGF submission.  I will save Roller progress for the next post and instead focus on Hostile Territory.

I joined the Hostile Territory team since from outward appearances they were operating close to the ideals of the Fifth Team.  My goal and intent for the team was not to do any design work.  I was going to contribute engineering talent without changing the game design.  Engineering spent the first two weeks converting the project to Unity.  I spent the first week working on network-related code.  The second week networking was abandoned by the team in favor of split screen.  My new focus was on building a character controller.  My goal when I joined the team was to have the character walk up the walls.  During the pitch for Hostile Territory the cylindrical environment had inspired questions regarding the state of gravity and wall walking.  I built a basic character controller and released it before the start of week three.  I continued to focus attention on wall walking.  Multiple engineers were working on a character controller since a character controller was deemed high priority.  In the end, having multiple engineers work on the same portion of the project cost the team a lot of iteration time.

By the second week of the project there was growing discontent in the engineering ranks.  The code base was composed of many quick implementations that did not necessarily work together.  The uncertainty regarding architecture and the outcry against the current code base made working nigh-impossible.  I continued work on wall walking since the ideas behind the new architecture were not being communicated.  I released a basic version of my character controller during the fourth week, but knowing that another version was being built by the architecture curators, I knew the controller was temporary.  Adding the basic version of the controller marked the end of wall walking progress for Hostile Territory.  The focus changed to bug fixes for the new architecture.  The controller lasted for a week before the new controller was released.  At least the team had a character controller before October.  The new architecture was also released, removing the fear and uncertainty preventing work from getting done.

Ideally, the new architecture was meant to save time as the project progressed.  The immediate outcome was creating an environment where engineers outside the architecture group were unable to contribute.  Another outcome was preventing iteration on the project by slowing down development.  Once the architecture was released around week eight or nine the team had the same game as at the beginning of the semester, but the code was different.

With a month left on the project, and the same game the team started with up and running, the team was moving again.  Rob built a level with ridges and better looking capture tiles.  The level was added to the game.  The character rig had been added earlier.  The game was looking interesting.  I built a method for bringing the camera closer to the character.  The mesh in the level turned transparent when the camera moved through it, so the view wasn’t obscured the way it was in the other level.  Moving the camera in was dropped.  I consolidated the camera movement into one function and added a first person mode.  First person mode didn’t make the cut.

At this point in the project there wasn’t much time left before IGF submission.  I worked on anything mentioned by producers or Triston.  There was a major bug where the game controllers were not functioning properly.  I added a fix that ensured the controls would stay working.  I overheard producers talking about a reticle that was fixed to the camera’s look direction, so I added a sprite to the screen and surprised producers with how quickly the fix was implemented.  I changed the grounding procedure to a raycast within epsilon to combat bugs with sticking to walls and raining death from above.  I removed random red lines from the build that were left over from the other aiming reticle.  I added a way for the game’s animators to ignore player input while an animation plays.  I prevented the game from continuing to move after the match ends.  All of the bug fixes above don’t mean much to me.  Software maintenance is important, but it isn’t exciting.  I can be paid to do maintenance, but it is not my passion.  My passion is in developing systems that add functionality to a product.

The design of Hostile Territory needs work.  There is a lot of bloat.  The focus of iteration was on weapons, which didn’t add much to the game.  The core mechanic of controlling territory to damage the other player makes level design a priority, since any spot the player can stand on that cannot be controlled becomes a dominant strategy.  The pacing is too slow.

All the negative points aside, I have enjoyed working with the Hostile Territory team.  I have butted heads with other team members and rubbed shoulders with others.  My initial impressions of the team were wrong.  The team does not iterate rapidly.  The team is not pursuing a narrowly scoped project centered around a consistent idea.  The team however has been able to produce the game they set out to build.  The engineers are working together better than at the start of the semester.  Enfranchisement is higher than at the beginning of the semester.  Production is doing work and starting to think critically about the game.  Art has worked fast and hard to ensure the game looks amazing.

I would like to give thanks to Ron, Triston and Saumya for their work toward getting a new architecture in place.  Though I do not agree with the reasoning and reactionary stance taken to build the new architecture, I do feel the cost and frustration will pay off as we move forward with the project.

I would like to acknowledge Ron for his work on the current character controller and architecture.  I am glad we have found common ground regarding engineering.  We have acknowledged our differences in engineering.  Ron’s preference to refactor continually contrasts with my aversion to refactoring until the system is broken.  There are benefits to both methods.  Refactoring frequently is important for maintenance reasons.  Avoiding refactoring is important for rapid development reasons.  Hopefully we can continue to learn from each other.

I would like to acknowledge Triston for his work as an engineering manager.  Managing engineers is a painful task, but Triston stepped up to the plate.  He has been a strong bridge between production and engineering.  He has devoted time to resolving conflicts in engineering and to being personable.

I would like to acknowledge Saumya.  The menus have been a great addition to the game.  I was amazed by the work done on the lobbing reticle.  The arc was awesome.  The current method of painting the ground with a target looks great as well.

I would like to give thanks to Rob and Mark for their work on the art.  The art is selling Hostile Territory.  The visuals are appealing and make the game look different than what it was at the start of the semester.  Rob’s level boosted morale at a critical time when morale was otherwise low.  Mark’s UI elements helped to resolve confusion with the design.

I would like to give thanks to Shelwin, Peijun, Owen and Sty.  Shelwin built a weapon system and managed to keep the code intact during the architecture push.  Peijun spent time animating the characters and asked frequent questions, helping communication across the entire team.  Owen built sounds that were funny, providing a sense of dark comedy to the game.  I think we should embrace the dark comedy.  Sty built UI elements that communicated the game was in overtime and that the player was being hurt, helping to remove confusion.

I would like to give thanks to production.  Though I still do not know what producers do, I have noticed the work that was done toward the end of the semester.  Getting sounds for the game was amazing.  Building a trailer was great.  Getting the game submitted to IGF on Thursday and not forcing the team on a death march is game changing.  Bravo!

I have grown from the Hostile Territory project.  A student project this size is a nightmare.  Normally student projects force one or two key players to do the work.  Hostile Territory is no different.  The team has stake in the project, but there were key players.  The key players for Hostile Territory were Ron, Rob and Triston.  The argument could be made that the project was held hostage, but without individuals putting a strong effort into the project it doesn’t go anywhere.  I am passionate about building neat features, but if I were left to my own devices without being able to change the design I would still be working on wall walking.

I will write about Roller in another post.  This post is way too long.  Roller deserves another post.

Game Engineering Assignment 06

Controls

Box: W – Positive Z; S – Negative Z; A – Negative X; D – Positive X; Q – Positive Y; Z – Negative Y; R – Rotate in Positive X; H – Rotate in Positive Y; T – Rotate in Positive Z; V – Rotate in Negative X; F – Rotate in Negative Y; G – Rotate in Negative Z

Camera:  I – Positive Y; K – Negative Y; J – Negative X; L – Positive X; U – Positive Z; M – Negative Z; Insert – Rotate in Positive X; Delete – Rotate in Negative X; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive Z; Down Arrow – Rotate in Negative Z

Abstract

The project has been updated with new builders inside the tools.  There are no visual changes.  The file structure now includes precompiled shaders.  The reason for compiling at build time instead of run-time is speed and reliability.  The program gains speed since clock cycles can be devoted to tasks other than compiling the shader.  The program gains reliability since build errors will surface before run-time, helping to ensure the end user does not end up with broken software.

Aims

The aim of the sixth assignment was to develop more build-related tools to make the build process more efficient and robust.  Efficiency has been gained since shader compilation now happens at build time, freeing power for other tasks.  Robustness was gained by building a structure for builders that allows more than one builder to run a process with input from a human-readable file.  Human-readable files provide a means for non-programmers to change the project without necessarily needing the aid of an engineer.

Implementation

I built a Lua file structure for loading shaders.  Using an example from John-Paul I made modifications to the generic builder to specify the file extension in the arguments of the file.  The generic builder does a direct copy of data, but needs to know the extension of the file.  There is likely a programmatic solution to the problem, but for now users need to express the extension of the file.

The builders to this point are the Shader Builder, Mesh Builder and Generic Builder.  The Shader Builder compiles shader files that have been placed in the assets folder and added to the list.  The Mesh Builder is a work in progress and will compile mesh data to a binary file helping to streamline asset creation.  Lastly, the Generic Builder copies files directly with no modification.

Organization

My Lua file specification for sending assets to the asset builder employs a human-readable format similar to the one employed by John-Paul.  I wanted to be able to change extensions, so I kept them in the format.  I extended the generic builder to require the file extension as an argument.  I will be looking for a programmatic way to handle extensions.

Here is the structure of the asset build list Lua file.

--[[
This file lists every asset to build
]]

return
{
    -- Shader Programs
    {
        builder = "ShaderBuilder.exe",
        extensions =
        {
            source = "hlsl",
            target = "shd",
        },
        assets =
        {
            -- Vertex shader programs
            {
                name = "vertexShader",
                arguments = { "vertex" },
            },
            -- Fragment shader programs
            {
                name = "fragmentShader",
                arguments = { "fragment" },
            },
        },
    },
    -- Meshes
    {
        builder = "MeshBuilder.exe",
        extensions =
        {
            source = "mesh.lua",
            target = "mesh",
        },
        assets =
        {
            {
                name = "cube",
            },
            {
                name = "floor",
            },
        },
    },
    -- Generic Assets
    -- (That just get copied as-is rather than built)
    {
        builder = "GenericBuilder.exe",
        extensions =
        {
            source = "txt",
            target = "txt",
        },
        assets =
        {
            {
                name = "material",
                arguments = { source = "lua" }
            },
        },
    },
}

There are dangers to the file format.  The file runs the risk of getting long.  Ultimately, developing a tool to manipulate the file without editing raw text will be necessary.  The design is programmer friendly, but will get unwieldy for non-programmers.

Issues

I had an issue with loading the shader binary data.  I thought it needed to be loaded into ID3DXBuffer somehow.  Saumya pointed out that the data was just binary data and could be used directly through a DWORD*.  John-Paul added that any pointer to raw data would work including DWORD* or void*.  I have worked with memory directly, but for some reason I thought it needed to be in a special struct like ID3DXBuffer.  I am grateful for the responsiveness of the engineers in the course.
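The takeaway is that the compiled shader is just bytes on disk.  A minimal sketch of reading a whole file into memory (ReadWholeFile is a hypothetical helper, not the engine's actual loader); the resulting buffer's pointer can then be handed to Direct3D as a DWORD* with no wrapper struct:

```cpp
#include <cstdio>
#include <vector>

// Reads an entire binary file into a byte vector; returns an empty vector on
// failure.  The caller can reinterpret the data pointer as DWORD* or void*.
std::vector<unsigned char> ReadWholeFile(const char* path)
{
    std::vector<unsigned char> data;
    std::FILE* file = std::fopen(path, "rb");
    if (file == nullptr)
        return data;
    std::fseek(file, 0, SEEK_END);
    const long size = std::ftell(file);
    std::fseek(file, 0, SEEK_SET);
    if (size > 0)
    {
        data.resize(static_cast<std::size_t>(size));
        std::fread(data.data(), 1, data.size(), file);
    }
    std::fclose(file);
    return data;
}
```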

Time Estimate

Given that fall break fell during the assignment, the workload wasn’t bad.  The problems were minor and light.  I estimate I stayed within my 12-hour budget for coursework from the class.  I am going to try to get the mesh loader done early so I can spend more time working on IGF-related projects.

Activity | Time Spent
Implementing New Builder Information | 4 hours
Architecting Lua Reader for New Asset List | 2 hours
Adding Shader Loading to Game Engine | 2 hours
Write up | 1 hour
Total | 9 hours

Roller News

I am now back to working on Roller.  I will get tasks done on Hostile Territory, but I am not in a high enough position to change the design.  Instead of spending time working on lone wolf projects to try and save the game, I have decided to use my energy to get Roller into IGF.  I will give both games the time they need.

I showed Roller off at the playtest.  The game has some major flaws.  Creating a new mode needs to be streamlined.  The menu needs art.  The collisions need to be elastic.  I did receive a compliment about the scope of the game.  The game also has a clear goal, with falling off being something to avoid.  I am excited to make the game amazing and will continue to bring it in for testing.