Monthly Archives: December 2014

Game Engineering 2 Assignment 13

Controls

Left and right arrow keys move left and right. Forward arrow starts the car moving. Back stops the car.

The ‘1’ key toggles debug. Press it quickly to see Christmas lights! Create your own light show!

"Almost Christmas" is a game about final projects, tedium and long nights.  The game has a Christmas theme and is a tunnel where players can't reach the end, but they can enjoy the light show by pressing the '1' key rhythmically.  In the end it is "Almost Christmas".
“Almost Christmas” is a game about final projects, tedium and long nights. In the end it is “Almost Christmas”.

Abstract

The scope of assignment 13 was to make a game with the engine. The game I built is called “Almost Christmas”. “Almost Christmas” is a game about not being able to reach Christmas break. The experience is meant to inspire tedium in the player. The player is presented with an artificial choice of left and right. Moving forward moves the player closer to Christmas, but before the player reaches Christmas they are warped back to the beginning. Here’s a picture of the game:

Aims

My aim for the assignment was to build a more automatic asset process. I also wanted to update my scene file format. Both goals were met. The game was supposed to be an AI racing game where the player and an AI opponent navigate a grid and whoever reaches the goal first wins. With the asset and scene modifications I wanted to make, the scope of that game was too large. I decided to make a game about my current emotional state. I long for Christmas. Capturing that in a game led to “Almost Christmas”, a game about the tedium of trying to reach Christmas break and getting set back.

Implementation

The bulk of my implementation time was spent on making the reloader. The reloader is a process that runs in the background and monitors the assets folder. When a change is made in the assets folder, the reloader determines which asset changed. That asset is added to a list which is piped into the asset pipeline. The asset build process is run and builds the edited assets. Here are examples:

The Reloader is loaded. I have not made any modifications to files yet.
The Reloader is up and I have made edits to a scene, mesh, material, shader and texture file. There are even errors being printed due to copying and pasting a fragment shader and then renaming it to be a vertex shader. The not found error is due to the Reloader copying the temporary copy file from Windows into AssetsToBuild. There are definitely some kinks in the Reloader, but I do not need to touch AssetsToBuild very frequently.
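For the curious, the core of the reloader boils down to a watch loop. Below is a minimal polling sketch of the idea, not my actual implementation; the `RunAssetBuilder` hook and the `Assets` folder name are hypothetical stand-ins.

```cpp
// Minimal polling sketch of the reloader idea (hypothetical names throughout).
// The real tool watches the assets folder and feeds changed files into the
// asset pipeline; here we simply poll modification times with std::filesystem.
#include <chrono>
#include <filesystem>
#include <iostream>
#include <map>
#include <thread>

namespace fs = std::filesystem;

// Hypothetical hook: append the changed asset to AssetsToBuild and run AssetBuilder.
void RunAssetBuilder(const fs::path& changedAsset)
{
    std::cout << "Rebuilding " << changedAsset << "\n";
    // e.g. invoke AssetBuilder.exe here once the asset list is updated.
}

int main()
{
    const fs::path assetsDir = "Assets"; // assumed layout
    std::map<fs::path, fs::file_time_type> lastWrite;

    while (true)
    {
        for (const auto& entry : fs::recursive_directory_iterator(assetsDir))
        {
            if (!entry.is_regular_file())
                continue;
            const auto time = entry.last_write_time();
            auto it = lastWrite.find(entry.path());
            if (it == lastWrite.end())
                lastWrite[entry.path()] = time;   // first sighting; remember it
            else if (it->second != time)
            {
                it->second = time;                // file changed on disk
                RunAssetBuilder(entry.path());
            }
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
    }
}
```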

The 2D UI element is the level number. If another scene were made, that scene would have the number 1 on it. I initially planned to have multiple racing maps, but had to make some concessions.

Overall, the asset builder has been the best tool from the class, and I have been excited to expand its capabilities. My Reloader reduces the number of times I have to manually edit AssetsToBuild and execute AssetBuilder. Paired with a scene system that uses key-value pairs and controllers, I do not need to touch actual source code as frequently as I otherwise would, which allows my engine to be more generic.
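To illustrate why the key-value approach keeps source edits rare, controller lookup can reduce to a name-to-factory table. This is a minimal sketch; the controller names and interface here are hypothetical stand-ins, not my engine's actual classes.

```cpp
// Hypothetical sketch of resolving a controller named in the scene file.
// Because actors reference controllers by string, adding an actor that reuses
// an existing controller requires no source changes at all.
#include <functional>
#include <map>
#include <memory>
#include <stdexcept>
#include <string>

struct IController { virtual ~IController() = default; virtual void Update(float dt) = 0; };
struct NoMove   : IController { void Update(float) override {} };          // actor never moves
struct FixedVel : IController { void Update(float) override { /* translate by velocity * dt */ } };

std::unique_ptr<IController> MakeController(const std::string& name)
{
    static const std::map<std::string, std::function<std::unique_ptr<IController>()>> registry = {
        { "NoMove",   [] { return std::make_unique<NoMove>(); } },
        { "Movement", [] { return std::make_unique<FixedVel>(); } },
    };
    auto it = registry.find(name);
    if (it == registry.end())
        throw std::runtime_error("Unknown controller: " + name);
    return it->second();
}

int main()
{
    auto controller = MakeController("NoMove"); // name as it appears in the scene file
    controller->Update(1.0f / 60.0f);
}
```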

Issues

I ran into time constraints which required me to reduce the scope of my game. Building better tools took most of the time allotted for the assignment. I am happy with my choice to build better tools. I am going to continue to extend my tools, but tool development takes a lot of time.

I would like to thank JP for explaining asset build processes. I am going to refine the process since it saves time. Sophisticated tools take time to build, but save time if used with enough frequency.

Time Estimate

The workload for the project was about 50 hours, plus or minus 15. I spent a lot of time on tools. The game was maybe a 4-hour endeavor after building tools. I am tired. I am going to sleep and enjoy Christmas.

Activity                            Time Spent
Reloader Tool                       20 hours
Editing Scene Format                20 hours
Scoping the Game and Freaking Out   4 hours
Building the Game                   4 hours
Write-up                            2 hours
Total                               50 hours

Fall 2014 Post-Mortem

Earlier in the semester I put together thoughts regarding the state of the semester. I would like to proclaim that the state of the Hostile Territory team has improved since that time. The Hostile Territory team consists of five producers, five engineers, three technical artists and two artists. Weighing in at fifteen people, the team is large, and it has experienced growing pains due to its size. Overall, the team is in a better place than at the start of the semester. I will now detail the semester's journey.

I had sworn off large teams when I left Retro Yeti in late June 2014. Retro Yeti, at that time, suffered from a culture that focused on last-minute heroics toward a goal discussed by leads in closed meetings. I attributed the problem to large teams and left to form a new team. The goal was to get two producers, four engineers and two artists on the team. The new team formed with only three people. As a new team we built four prototypes. The prototype Roller went on to the “Utah Game Wars” competition as a finalist and is now submitted to the “Independent Games Festival”. Overall the summer prototyping session was a success; however, when the semester began it was apparent that the fifth team would not form.

I decided to enjoy the success of Roller and let go of the creation of a fifth team. The first day there was a meeting about team formation. Other people had left their original teams, but instead of joining the fifth team, went to safer teams. I spoke briefly with Jose and Ryan regarding my situation and speed-dating teams was mentioned, so I set off to find a team. I had worked with Rody and Rob on Reflex Speed in the spring and decided it would be a good idea to join the Hostile Territory team. I asked to join the team and was welcomed. The talk of the team was switching from Unreal Engine 4 to Unity. I decided that I would be a good team member and keep my efforts to building whatever unassigned tasks were deemed important.

The engineers had a first meeting about getting a Unity build up and working. I chose to do networking with George. There were synchronization issues in the build from last spring. I looked over the network code George had and realized that the connection being used was UDP peer-to-peer. While fast, a UDP peer-to-peer connection is not synchronized by design. George and I talked and concluded that we needed a client and dedicated server system to ensure synchronization. The game should run on the server, with each client sending remote procedure calls to its character to ensure shooting was synchronized.

I was working on a client and dedicated server system in the first week. I had movement working for more than one client using remote procedure calls when Triston, the engineering lead, asked me to leave networking behind in favor of split screen. Triston asked me to start work on a third person controller. I started work on the controller and was later asked whether it would help if Ron joined me on it. Ron and I began working on controllers. As Ron and I started to work together it became apparent that we were very different engineers. Ron spent time building up interfaces that every controller should use. I spent time building a controller that could wall walk.

During the second or third week the term re-architecting started being thrown around by Triston and Ron. The code that had been written for the game was seen as inadequate and was being rewritten in a less “hacky” way. I was mortified. The Hostile Territory team seemed to be doing well, but the re-architecting with leads looked, smelled and walked like the problems from Retro Yeti. Engineering became stagnant for the end of September as the new architecture was being developed. Attitudes on the engineering team went from fun to fear. Shelwin started work on a weapon system that was compartmentalized from the new architecture. I kept myself busy working on the wall walking movement controller. I vocalized my distaste for the re-architecting process. I started to let my 9 AM to 1 PM schedule for Hostile Territory slip since the new architecture was up in the air. Other engineers were biding their time waiting for the release of the new architecture.

The new architecture was released. I updated the wall walking controller to meet the interface needs of the new architecture. There were still bugs with the wall walking controller. Triston told me to remove wall walking and get the controller into the game. I cobbled together a solution to get the controller into the game without wall walking. The solution worked, but had problems. Over the weekend Ron released the controller he built in combination with the architecture and it was better than mine, so we moved forward with Ron's controller. I brought attention to the wasted effort from both Ron and me in developing separate controllers and asked production for help. My morale tanked and I was unable to put my finger on any part of the project that was mine. Ron and I started working on understanding our differences and the strengths and weaknesses of both of our approaches. The competition between Ron and me started to become mutual respect.

I changed gears and started working on bug fixes. I spent time between bug fixes working on the wall walking controller. I was able to fix a significant issue where yaw rotations beyond 90 degrees in either direction caused undefined movement. I made strides with the wall walking controller and was working on getting the controller to work in environments with noisy surfaces. Production didn't have a specific task for me. As a result, when not fixing bugs for the IGF build, I had extra time to work on wall walking, since the feature still came up as good for the game.

I was working on two games for IGF. I spent the mornings with Hostile Territory and the evenings with Roller. Luckily the engineering classes gave extra time on due dates around IGF, since I spent all my time working on Hostile Territory and Roller. Hostile Territory was submitted on the Thursday before Halloween. Roller was submitted on Halloween. I was running on little sleep and took the following weekend off to recuperate from the IGF submissions.

The team played the IGF submission and came to a consensus that the game was not fun. I stated that the game we submitted was the same game we started with, except with a new cave level. During the process of building the IGF submission the team was informed that iteration would occur afterward. The team pushed for prototyping. There was backlash from production, but the team was able to get a prototyping phase approved. As an engineer I am strong at prototyping, so I decided to take on three prototype ideas. The first idea was from our artist Mark and involved blowing up minions to remove tiles. The second idea was a play on Mark's idea involving non-permanent, light-switch-style toggling of tiles. The third idea was Peijun's musical chairs, where players had to move to their own territory on a timer or fall. I was able to prototype the first and second ideas. Peijun finished the third idea a week later. As a team we presented six prototypes to Jose and Ryan. The detonate and light switch prototypes were selected to move the design forward. Both included the wall walking controller. A major selling point for the new prototypes was clean visuals.

During the last weeks of the semester the team spent time getting a build ready for IGF. I spent the majority of my time jumping in to help others with their tasks. I have started to solidify my role on the team as a prototype engineer who ensures ideas on the team get a fair trial. I do not have the dread I experienced during the middle of the semester. The team seems to be working together better. Each engineer has organically taken a role to help the team. The roles are not all-inclusive of our work, but serve to denote the strengths we have individually brought to the team. The roles I identify are Triston as engineering manager, Saumya as GUI engineer, Ron as architecture and systems engineer, George as build engineer, Peijun as animation engineer, Shelwin as weapon, sound and shader engineer, Sty as technical artist and myself as prototype engineer.

The engineering team has been gaining steam since the IGF submission. The culture is improving and we can get more work done in less time. We have more respect for each other than when we started. With the success of prototyping I would like to continue to provide engineering expertise to move the design forward. If anyone on the team has an idea to move the design forward, I would love, after talking to Rody, to build it out so the team can give the idea a fair trial.

Moving forward I will be addressing issues with the camera and the wall walking controller. Re-architecting has prevented the team from building some of the suggestions from Jose and Ryan. I would like to prototype some of those ideas, including a controller where the camera isn't controlled directly and a fixed camera at either end of the tube. Both ideas have design problems, but deserve a fair trial. I will be working with Rody to find good solutions to design problems in both the light switch and detonate prototypes.

Game Engineering 2 Assignment 11

Controls

Atlas – [0-9] on the keyboard will show an image for [0-9].

Light: T – Positive Z; G – Negative Z; F – Negative X; H – Positive X; R – Add Red Light; V – Subtract Red Light; Y – Add Green Light; N – Subtract Green Light;

Camera: I – Positive Z; K – Negative Z; J – Negative X; L – Positive X; U – Positive Y; M – Negative Y; Insert – Rotate in Positive Z; Delete – Rotate in Negative Z; Right Arrow – Rotate in Positive Y; Left Arrow – Rotate in Negative Y; Up Arrow – Rotate in Positive X; Down Arrow – Rotate in Negative X

Objects: WSAD – Move Object; Up and Down are Q and Z.

Abstract

The scope of assignment 11 was to add resolution independent sprites with a different rendering mode to the game. The shaders involved needed to allow atlasing. In addition to changing rendering modes, changing sampling state was introduced.

Aims

The aim of the eleventh assignment was to implement resolution-independent sprites, mix indexed and non-indexed drawing, and understand render states.

Implementation

My implementation started with improvements to the scene file. I am working toward an architecture where the user can leave data out of the scene to initialize that data to default values. A case where the approach is most noticeable is in determining the keys that move an object. In addition, I have added handling of input and movement controllers in the scene file. The scene file currently contains all the actors in the game.

The scene file is being shown. Each actor is surrounded by curly braces. The type is specified and handled by the World at run-time when creating in-game objects for the scene. The name is the actor's name in the game. Since I presently only allow one light source, the name Light is fitting. The location is a vector in the XZ plane from the origin. The rotation for a light is currently being used to determine color.
The new part is the controller. The movement part links to a controller in game that only does translation by a fixed velocity. The input controller determines the action taken for input.
The controls have changed so that the user should only define the directions that make sense. For the direction of the light, Y doesn't make sense, since the displacement in the XZ plane is the vector that shifts the light. That means that Y has been excluded from the movement.
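A minimal sketch of the default-value idea is below. The scene file's real parsed representation isn't shown in this post, so a plain string map stands in for it, and the key names are hypothetical.

```cpp
// Hypothetical sketch: reading optional scene-file values with defaults.
// If a key such as "controller" is missing, the caller gets a fallback
// (e.g. NoMove), so authors only write the directions that make sense.
#include <iostream>
#include <map>
#include <string>

using KeyValues = std::map<std::string, std::string>;

std::string GetOrDefault(const KeyValues& actor, const std::string& key,
                         const std::string& fallback)
{
    auto it = actor.find(key);
    return it == actor.end() ? fallback : it->second;
}

int main()
{
    // A light actor that omits the Y direction and the controller block.
    KeyValues light = { { "name", "Light" }, { "posX", "T" }, { "negX", "G" } };

    std::cout << GetOrDefault(light, "controller", "NoMove") << "\n"; // NoMove
    std::cout << GetOrDefault(light, "posY", "unbound") << "\n";      // unbound
}
```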

I have defined sprites that render in screen space, along with a way to add sprites to the game through the scene file. The current implementation is less generic than I would like, so I will be tuning it.

Sprites can be defined in the scene file. Two sprite definitions are shown. The first sprite is the logo and uses the logo material. The logo material has a texture that is the EAE 6320 logo. The controller for the sprite is not defined. If there is no data for the controller in the scene file, the NoMove controller is given to the actor, preventing motion during the update loop.
To show that a controller does not need to be declared at all, the second actor has no controller block. The second sprite is a sprite sheet. Presently, to show individual numbers, the image is atlased by an internal controller. I will be adding a way to reference the controller from the scene file and provide specific UV coordinates on a per-button basis in the future.
I specify Top, Bottom, Left and Right. As I was implementing sprites, the idea of a top and left coordinate with a scaling factor came to mind. I am currently only using the top and left coordinates and will be updating the sprite data. Scaling the texture is not currently in the game.
Scaling is useful in my implementation because I am currently maintaining the size of the original texture. With a scaling factor, UI programmers could resize the texture in a data-driven way.
An interesting design specification of sprites is that they are resolution independent and hold position on the screen. Having been a UI programmer, I understand the importance of keeping the image undistorted and in the correct position. A potential pitfall of my method is that the area of the screen taken by the sprite depends on resolution. Adding flexibility for area may be a future iteration.
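A minimal sketch of the resolution-independence math is below. It assumes a pixel-space rectangle converted to clip space using the current window size; the struct and function names are hypothetical, not my actual code.

```cpp
// Sketch of the resolution-independence idea: keep the sprite's size fixed in
// pixels and derive clip-space coordinates from the current resolution, so the
// image never stretches, though its fraction of the screen varies.
#include <iostream>

struct ScreenRect { float left, top, right, bottom; }; // clip space, -1..1

ScreenRect PixelRectToClipSpace(int pxLeft, int pxTop, int pxWidth, int pxHeight,
                                int screenWidth, int screenHeight)
{
    ScreenRect r;
    r.left   = (2.0f * pxLeft) / screenWidth - 1.0f;
    r.right  = (2.0f * (pxLeft + pxWidth)) / screenWidth - 1.0f;
    r.top    = 1.0f - (2.0f * pxTop) / screenHeight;   // pixel y grows downward
    r.bottom = 1.0f - (2.0f * (pxTop + pxHeight)) / screenHeight;
    return r;
}

int main()
{
    // The same 128x64 sprite at (16,16) covers a smaller fraction of a
    // 1600-wide window than of an 800-wide one, but the same number of pixels.
    ScreenRect a = PixelRectToClipSpace(16, 16, 128, 64, 800, 600);
    ScreenRect b = PixelRectToClipSpace(16, 16, 128, 64, 1600, 600);
    std::cout << a.left << " vs " << b.left << "\n";
}
```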

I added new events for PIX. PIX has proven to be a fine way to debug DirectX-related calls. I have been using it for all sorts of issues and feel comfortable with the features I have used.

PIX events have been created for both drawing meshes and drawing sprites. The render state is modified based on whether meshes or sprites are being drawn.
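A sketch of what that pass boundary can look like in Direct3D 9 is below; the event name, the particular render-state choices and the draw-call placement are assumptions for illustration, not my exact code.

```cpp
// Sketch: wrap the sprite pass in a PIX event and switch render state.
// D3DPERF_BeginEvent/D3DPERF_EndEvent are the standard PIX markers in D3D9.
#include <d3d9.h>

void DrawSprites(IDirect3DDevice9* device)
{
    D3DPERF_BeginEvent(D3DCOLOR_XRGB(0, 255, 0), L"Draw Sprites");

    // Sprites are screen-space overlays: no depth test, alpha blending on.
    device->SetRenderState(D3DRS_ZENABLE, D3DZB_FALSE);
    device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);

    // ... issue the non-indexed sprite draw calls here ...

    // Restore the mesh state before the next pass.
    device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
    device->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);

    D3DPERF_EndEvent();
}
```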
The sprites in the game are resolution independent. The top left window is an 800 by 600 window. The bottom window is 1600 by 600. The leftmost window is 800 by 1600. I didn’t line them up pixel perfect, but the size is the same.
The sprite 1 in blue on the top left side of the screen is shown when the 1 button on the keyboard is pressed. The 1 shown on the screen is from a sprite sheet. The sheet has the numbers [0-9] on it.
The same sprite is being used, but now the texture UVs have changed to show the 6. The 6 key is being held down.
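A minimal sketch of the atlasing math is below; it assumes the digits form a single horizontal strip, which this post doesn't actually specify, and the names are hypothetical.

```cpp
// Sketch of selecting a digit's UV rectangle from the number atlas, assuming
// the digits 0-9 sit side by side in one horizontal strip.
#include <iostream>

struct UVRect { float left, top, right, bottom; };

UVRect DigitUVs(int digit, int digitCount = 10)
{
    const float width = 1.0f / digitCount;  // each digit's share of U
    UVRect uv;
    uv.left   = digit * width;
    uv.right  = uv.left + width;
    uv.top    = 0.0f;                       // strip spans the full V range
    uv.bottom = 1.0f;
    return uv;
}

int main()
{
    UVRect six = DigitUVs(6);
    std::cout << "U: [" << six.left << ", " << six.right << "]\n"; // [0.6, 0.7]
}
```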

Issues

Using PIX, I have traced the issues with my release build to the view-to-screen matrix. I am getting closer to resolving the issue. I have screenshots detailing the issue.

The PIX screenshot shows the information provided to the vertex shader for a cube. The values for position are well defined.
After the vertex shader runs, the values for the x and y coordinates are huge.
The view-to-screen matrix is in registers c4 to c7. The numbers in release are not well defined. In debug the values are between -10 and 10. I have been looking into the creation of this matrix and will be creating a log file to figure out how to fix the issue.
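Since the caption above mentions creating a log file, here is a minimal sketch of reading back and logging those registers; `GetVertexShaderConstantF` retrieves what was last set on the device, and the file name is a placeholder.

```cpp
// Sketch: dump the view-to-screen matrix (registers c4-c7) to a log file so
// the debug and release values can be compared side by side.
#include <d3d9.h>
#include <fstream>

void LogViewToScreen(IDirect3DDevice9* device)
{
    float m[16];
    device->GetVertexShaderConstantF(4, m, 4); // four float4 registers: c4..c7

    std::ofstream log("viewToScreen.log", std::ios::app);
    for (int row = 0; row < 4; ++row)
    {
        log << "c" << (4 + row) << ": "
            << m[row * 4 + 0] << " " << m[row * 4 + 1] << " "
            << m[row * 4 + 2] << " " << m[row * 4 + 3] << "\n";
    }
}
```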

I also had an issue with my sprite fragment shader. In order to add alpha information the shader needs to output four floats. I was only outputting three floats. The result was a messy image. Detection of the issue involved loading the .dds file into Visual Studio and removing channels until the image matched. When the image looked the same I knew which channels I was missing in the fragment shader.

I also defined the triangles of a sprite incorrectly, such that the triangles overlapped, leaving a wedge of black on the left side. The fix was easy to figure out with PIX debugging.

Time Estimate

The workload for the project was about 14 hours.

Activity               Time Spent
Updating Scene File    4 hours
Implementing Sprites   4 hours
Tracking Down Bugs     4 hours
Write-up               2 hours
Total                  14 hours