
As expected, not a lot got done over Thanksgiving break, which was a nice rest for everyone. The week after that, after battling the CS department, I finally got Unreal Swarm working. In case you don't know/remember what it is, it's a task distribution system that spreads lighting builds across all the machines in a swarm. What this simply means is that our lighting build times are going to be much, MUCH faster now, so future level changes can be incorporated much more quickly. Another amazing thing that happened was that we got a decal attached to the ping, which makes it a billion times more visible, and it looks amazing. I worked with Joe and Vinod to get it working with the current ping that we have. We've also decided to try a more open approach to levels: instead of linear levels one after the other, we will have levels interconnected with each other, allowing players to complete them in any order. This means the current level system is going to be completely replaced with a new system to accommodate the change. We also added more functionality to checkpoints; in addition to saving the player's last location, they now measure the player's progress through the open level system and decide when to unlock the last level. We've been working to get these changes in, and are on track for EAE Day next week.

For this assignment we had to implement sprites in our game. Sprites can be used to display messages to the player, as well as UI elements. To do so, I implemented a sprite class. Since sprites are always drawn on top of everything, we draw them last, with the depth buffer off. Sprites are also alpha blended with whatever is behind them, unlike meshes. To accomplish both of these, we change the render state of our device to disable the depth buffer and enable alpha blending, draw the sprites, and then undo the render state changes. The hardest part of this assignment was making sure that the sprites don't appear stretched, no matter the resolution. It took some thinking and trying a bunch of different things before I could get that part working. Here are a couple of screenshots showing the sprites at two different resolutions:

sprite1

800×600


sprite2

1280×720
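The fix for the stretching comes down to sizing the sprite from the actual screen resolution instead of using fixed clip-space extents. Here is a minimal sketch of that math in C++ (the struct and function names are illustrative, not my actual sprite class):

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch: convert a sprite's desired on-screen pixel size into
// clip-space extents so it covers the same number of pixels at any resolution.
struct SpriteExtents { float width; float height; };

SpriteExtents ComputeSpriteExtents(int spritePxW, int spritePxH,
                                   int screenW, int screenH)
{
    // Clip space spans -1 to +1 on each axis (2 units total), so one pixel
    // is 2/screenW units wide and 2/screenH units tall.
    SpriteExtents e;
    e.width  = 2.0f * spritePxW / screenW;
    e.height = 2.0f * spritePxH / screenH;
    return e;
}
```

Since the extents are derived from the current resolution, a 100×100 pixel sprite covers exactly 100×100 pixels at both 800×600 and 1280×720, so it never appears stretched.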


The blue numbers on screen are drawn using a spritesheet. To change the numbers, press any number key from 0-9 to see the corresponding number show up. This currently works only with the number keys above the letters on a keyboard, not with the numpad. Here are a couple of PIX screenshots showing the render state of the device when drawing a sprite:

pixsprite1 pixsprite2
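The spritesheet lookup itself is simple once the sheet layout is fixed. Assuming the ten digits sit left to right in a single row (a hypothetical layout, just for illustration), picking a digit is only a matter of offsetting the U coordinates:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch: with ten digits in one row, digit d occupies the
// horizontal tenth [d/10, (d+1)/10) of the sheet.
struct UVRect { float u0, v0, u1, v1; };

UVRect DigitUVs(int digit) // digit must be 0-9
{
    const float cellWidth = 1.0f / 10.0f;
    UVRect r;
    r.u0 = digit * cellWidth;
    r.u1 = r.u0 + cellWidth;
    r.v0 = 0.0f; // a single row, so V spans the whole texture
    r.v1 = 1.0f;
    return r;
}
```

Swapping the displayed number is then just a matter of feeding a different UV rectangle to the sprite's vertices; the texture itself never changes.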


Here is another screenshot showing the PIX instrumentation:

pixsprite3


Time Taken:
Reading/Understanding what to do: ~0.5 hrs
Getting the Sprite Class working: ~1.5 hrs
Adding new shaders: ~0.5 hr
Getting spritesheet working: ~0.5 hr
Making the sprites not stretch on resolution changes: ~1.0 hr

Total Time: ~4 hrs

You can download the working exe here.

Your browser may say that the file is malicious. This is because some browsers do not allow sharing executables from unknown sources. However, the file is perfectly safe to run.

This assignment was all about lighting. We had to add a directional light and an ambient light to our game. The specific requirements were:

  1. Add normals to meshes, so they could interact with lights
  2. Have a directional light that is moveable
  3. Be able to change the color of both ambient and directional light


For the first part, we had to add the normal information to the MayaExporter, and then make sure that it made it all the way from the Maya-exported mesh file to our shaders. This meant that the MeshBuilder had to rebuild all meshes with normal information in them, and then we had to read the new data from the mesh binary files and pass it into the vertex buffer. Once I got that working, adding the lighting was a matter of updating both the vertex and fragment shaders to handle normals and add lighting information to the final screen output. Making the light moveable involved more shader constants, and adding keyboard controls to actually move it around. The controls I implemented are:

  1. I and K to move the light in and out of the screen
  2. J and L to move it left and right
  3. U and O to move it up and down


Since it is a directional light, there is no single source that can be seen; it's just the direction that is being changed.
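For reference, the lighting math itself is just ambient plus a Lambert diffuse term. Here is a rough sketch of it in plain C++ rather than shader code (names and types are illustrative, not my actual shader):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Illustrative sketch of per-fragment lighting: ambient plus Lambert diffuse.
struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

Vec3 Shade(const Vec3& albedo, const Vec3& normal,
           const Vec3& lightDir,   // direction the light travels, normalized
           const Vec3& lightColor, const Vec3& ambient)
{
    // Lambert term: how directly the surface faces the light; clamped so
    // surfaces facing away just get ambient.
    const float nDotL = std::max(0.0f,
        Dot(normal, Vec3{-lightDir.x, -lightDir.y, -lightDir.z}));
    Vec3 out;
    out.x = albedo.x * (ambient.x + lightColor.x * nDotL);
    out.y = albedo.y * (ambient.y + lightColor.y * nDotL);
    out.z = albedo.z * (ambient.z + lightColor.z * nDotL);
    return out;
}
```

Because only `lightDir` appears in the math, moving the light with the keyboard just rotates that one direction vector; there is no position to move.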

To change the color values of the lights, I currently have a place in code to do it, but I will soon make it read those values from an external file, so that we won't need to rebuild the game just to change the light information. Currently they can be changed in GameScope.h, where there are three constants, shown here:

gamescope
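Once the values move to an external file, the reading code could look something like this sketch (the file format here is hypothetical, just to show the idea of changing lights without a rebuild):

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <sstream>
#include <string>

// Hypothetical sketch: parse lines like "AmbientColor = 0.2 0.2 0.2" so
// light values can be tweaked without rebuilding the game.
struct Color { float r, g, b; };

std::map<std::string, Color> ParseLightFile(std::istream& in)
{
    std::map<std::string, Color> colors;
    std::string line;
    while (std::getline(in, line))
    {
        std::istringstream ls(line);
        std::string name, equals;
        Color c;
        // Only accept well-formed "Name = r g b" lines; skip everything else.
        if (ls >> name >> equals >> c.r >> c.g >> c.b && equals == "=")
            colors[name] = c;
    }
    return colors;
}
```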

This is what my game looks like now:

lighting


Here is a PIX screenshot of me debugging a single pixel on screen, and stepping through the fragment shader to see what are the values being used:

pixfragment

Time Taken:
Reading/Understanding what to do: ~0.5 hrs
Getting the MayaExporter working: ~0.5 hrs
Getting normals working: ~0.5 hr
Getting lighting working: ~0.5 hr
Making the light moveable and colors changeable: ~1.0 hr

Total Time: ~3 hrs

You can download the working exe here.

Your browser may say that the file is malicious. This is because some browsers do not allow sharing executables from unknown sources. However, the file is perfectly safe to run.

The semester's end is drawing closer, and I honestly can't wait for it. This has been a very long semester, with the whole IGF submission and now EAE Day preparation for the thesis game. We got the new ping mechanic working, though it makes the game much harder in first person. While most of us don't want to switch to third person, the feedback from other people has been 50-50. This is another decision that needs to be made soon, as it would affect all other aspects of the game. We tried another design iteration, hex tiles, and it did not end well. We could not get the hex tiles working the way we wanted, and switching completely to hex tiles would have meant redoing all our existing levels. So we tried it, and it didn't work. Other than that, we are chugging along, constantly improving various parts of the game. The weekly builds have been a lot more time consuming than I anticipated, with me spending weekends getting builds ready, and then all of us losing a couple of hours every Tuesday to the all-hands meeting. This, along with the end of the semester approaching, has slowed development a bit, but we are still on track to get everything ready by 12 Dec.

This assignment had four parts to it:

  1. Build a plugin for Maya 2013, which would export a mesh into our mesh format that we made earlier
  2. Use the exported meshes in our game, with certain conditions:
    1. Draw two different meshes with the same material
    2. Draw the same mesh twice with different materials
  3. Add PIX instrumentation
  4. Make a user settings file, read it in and use it in the game


For the first part, we had most of the code for the plugin, and had to add functionality to make it export the Maya mesh into our specific format. It was simple enough, and after playing around with a few test meshes, I decided to export the mesh of the main character from my thesis game, 404Sight. I was able to import it into my game and render it easily enough, with the proper texture too! (I got the assets from one of the artists on the thesis team.) I then exported a basic cube and pyramid to add to my game. This really allows us to add any sort of mesh from Maya, and saves us from having to hard code data such as vertex positions and UVs. We can simply create a mesh in Maya and use it in game quickly.

For the next part, I decided to render a total of 6 meshes at the same time. One was the ground plane, one was the character mesh, and then two each of a cube and a pyramid. I drew them just around the character mesh, using red brick and green brick textures. In order to replace any mesh, I can simply go to my assets folder, replace cube.mesh, pyramid.mesh, plane.mesh or Character.mesh, rebuild them, and use them in game. Again, adding calls to draw all the new meshes was a simple enough task. Here is what my game looks like now:

meshes

As for PIX instrumentation, we were getting to the point where so many DirectX calls were being made that it was hard to scroll through them all in PIX to find the one I was looking for. The new PIX instrumentation lets me group function calls into collapsible nodes. This makes the PIX output much easier to read, and I can now easily look for a specific mesh or material call.
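The usual way to get those collapsible nodes in D3D9 is to wrap groups of calls in D3DPERF_BeginEvent/D3DPERF_EndEvent markers. Here is a sketch of the RAII-scope idea; instead of the real PIX calls it just records to a log so the nesting is visible (all names here are illustrative):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Illustrative sketch: an RAII scope that opens an event on construction and
// closes it on destruction. In the real code the constructor would call
// D3DPERF_BeginEvent and the destructor D3DPERF_EndEvent; here they append
// indented lines to a log instead.
std::vector<std::string> g_eventLog;
static int g_depth = 0;

struct ScopedPixEvent
{
    explicit ScopedPixEvent(const std::string& name)
    {
        g_eventLog.push_back(std::string(g_depth * 2, ' ') + "Begin " + name);
        ++g_depth;
    }
    ~ScopedPixEvent()
    {
        --g_depth;
        g_eventLog.push_back(std::string(g_depth * 2, ' ') + "End");
    }
};

void DrawScene()
{
    ScopedPixEvent mesh("DrawMesh");
    {
        ScopedPixEvent material("SetMaterial");
        // ...the actual material setup calls would go here...
    }
    // ...the actual draw calls would go here...
}
```

The nice part of the RAII approach is that every Begin is guaranteed a matching End, even with early returns, so the PIX tree never ends up unbalanced.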

Here is the new PIX output for the above screenshot:

Pixinstrumentation

The final part of the assignment was adding a UserSettings.ini, which allows users to change certain settings of the game. Currently these settings are width, height, and fullscreen. This is a human readable file, since it has to be easily changeable by end users. These are the most basic of all game settings, and now instead of being hard coded into the game, they can be changed from outside it.
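A sketch of what reading such a file can look like (assuming a simple "key = value" layout; the real UserSettings.ini may differ):

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Illustrative sketch: parse width/height/fullscreen from "key = value"
// lines. Unknown keys are ignored so the file stays forgiving for end users.
struct UserSettings
{
    int width = 800;       // defaults used when a key is missing
    int height = 600;
    bool fullscreen = false;
};

UserSettings ParseUserSettings(std::istream& in)
{
    UserSettings s;
    std::string line;
    while (std::getline(in, line))
    {
        std::istringstream ls(line);
        std::string key, equals, value;
        if (!(ls >> key >> equals >> value) || equals != "=")
            continue; // skip malformed lines
        if (key == "width")           s.width = std::stoi(value);
        else if (key == "height")     s.height = std::stoi(value);
        else if (key == "fullscreen") s.fullscreen = (value == "true" || value == "1");
    }
    return s;
}
```

Keeping sensible defaults in the struct means a missing or half-written file still launches the game instead of crashing it.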


Time Taken:
Reading/Understanding what to do: ~1.5 hrs
Getting the MayaExporter working: ~0.5 hrs
Getting multiple meshes working: ~1 hr
PIX Instrumentation: ~1 hr
User Settings: ~1.5 hrs
GIT downtime issues: 5 hrs

Total Time: ~5.5 hrs + 5 hrs(due to GIT)

You can download the working exe here.

Your browser may say that the file is malicious. This is because some browsers do not allow sharing executables from unknown sources. However, the file is perfectly safe to run.

For this assignment, we had to add textures to our game, with both meshes rendered with different textures. The textures also had to be part of our pipeline, which meant adding another tool, TextureBuilder, which reads in images in common formats (jpg, png, etc.) and converts them into .dds files, which are then used in our game. Another part of this assignment was understanding how UVs work, and incorporating that information into our mesh data.
The TextureBuilder was simple enough (thanks to our professor). However, incorporating textures into the game meant LOTS of small changes in a lot of files. First off, I had to ensure that the textures were actually being built (changing AssetsToBuild.lua). Then, I had to add the UV info to the mesh, which meant changing the structure of the mesh file I made in the last assignment to include UV information, and changing the MeshBuilder to read and parse that information as well. Then, I had to change the structure of the vertex buffer to add UV information into it. This meant that the vertex shader was also updated to process UV information. Finally, we had a new fragment shader, which uses the UV information to get the correct part of the texture and actually display it on the screen. The materials were also updated: a material now also specifies which texture it uses, which meant the material parsing code had to be updated to read in the texture name. The material class also needed a way to set the texture which the fragment shader would use, and finally we had to tell the fragment shader which texture to use before drawing each mesh. As you can imagine, this assignment involved a lot of changes all over the existing codebase, and any mistake would have caused everything to stop working. Here is the result of all the above changes:
twotextures
The cube has a soccer ball texture on it, while the floor has a brick wall texture on it. Here is a PIX capture of the above:
pixtex
I have highlighted the command which tells the shader which texture to use for the next draw.
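For reference, the vertex buffer change amounts to appending the texture coordinates to the vertex struct (illustrative types, not the project's actual struct; the D3D vertex declaration has to be updated to describe the same offsets):

```cpp
#include <cassert>
#include <cstddef>

// Illustrative sketch of the updated vertex layout: position and color were
// already in the buffer, and the UV pair is appended at the end.
struct Vertex
{
    float x, y, z;      // position        (bytes 0-11)
    unsigned int color; // packed ARGB     (bytes 12-15)
    float u, v;         // texture coords  (bytes 16-23), new for this assignment
};
```

Every consumer of the buffer (MeshBuilder output, the vertex declaration, and the vertex shader input) has to agree on this exact layout, which is why one change rippled through so many files.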

Time Taken:
Reading/Understanding what to do: ~0.5 hrs
Getting the TextureBuilder working: ~0.5 hrs
Getting everything else working: ~2 hrs

Total Time: ~3 hrs

You can download the working exe here.

Your browser may say that the file is malicious. This is because some browsers do not allow sharing executables from unknown sources. However, the file is perfectly safe to run.

The goal for this assignment was to streamline our build pipeline further. Now, instead of having hard coded mesh data in the game, we have to read it from an external file. To do this, we have to create our own mesh file format, read it in through a MeshBuilder, write it out in binary, and then finally read the binary file in our game code. The condition in this assignment was that we could only do a maximum of four read operations when reading the binary file in the game. Here is the Mesh format I made:
plane.mesh
I used a similar mesh file for the cube as well. I chose this format because it is very human readable, as well as relatively easy to parse through code. It is also easy to add new properties to the file, which lets me extend its functionality when needed. Next, I made a MeshBuilder tool, which reads in this mesh file and writes it out in binary. This was easily the longest part of this assignment. After I got the MeshBuilder working, reading in the binary was simple enough. Although it was possible to read the entire file with a single read operation, I decided to do it with three, as this let me easily read in the different types of data I had: the counts (vertex and index), the vertex data, and the index data. With this approach, I had to change very little in my existing Mesh class, and could easily incorporate the binary file.
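The three-read layout can be sketched like this (with a stripped-down vertex struct for illustration; the real one carries more data per vertex):

```cpp
#include <cassert>
#include <cstdint>
#include <sstream>
#include <vector>

// Illustrative sketch of the binary layout: the two counts come first, then
// the whole vertex block, then the whole index block.
struct Vertex { float x, y, z; };

struct MeshData
{
    std::vector<Vertex>   vertices;
    std::vector<uint32_t> indices;
};

void WriteMesh(std::ostream& out, const MeshData& m)
{
    const uint32_t counts[2] = { static_cast<uint32_t>(m.vertices.size()),
                                 static_cast<uint32_t>(m.indices.size()) };
    out.write(reinterpret_cast<const char*>(counts), sizeof(counts));
    out.write(reinterpret_cast<const char*>(m.vertices.data()),
              counts[0] * sizeof(Vertex));
    out.write(reinterpret_cast<const char*>(m.indices.data()),
              counts[1] * sizeof(uint32_t));
}

MeshData ReadMesh(std::istream& in)
{
    MeshData m;
    uint32_t counts[2];
    in.read(reinterpret_cast<char*>(counts), sizeof(counts));   // read 1: counts
    m.vertices.resize(counts[0]);
    m.indices.resize(counts[1]);
    in.read(reinterpret_cast<char*>(m.vertices.data()),
            counts[0] * sizeof(Vertex));                        // read 2: vertices
    in.read(reinterpret_cast<char*>(m.indices.data()),
            counts[1] * sizeof(uint32_t));                      // read 3: indices
    return m;
}
```

Reading the counts first lets the vertex and index buffers be sized exactly before the bulk reads, which is what makes three reads enough.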

Time Taken:
Reading/Understanding what to do: ~0.5 hrs
Getting the MeshBuilder working: ~5 hrs
Getting the Binary file reading working: ~1 hr

Total Time: ~6.5 hrs

You can download the working exe here.

Your browser may say that the file is malicious. This is because some browsers do not allow sharing executables from unknown sources. However, the file is perfectly safe to run.

This week, we did a mini post mortem of sorts, to see how everyone felt about the way we did things up to IGF. We discussed what worked, what didn't, and what could be improved. We ditched Hansoft completely, and decided to continue doing things the way we have been. However, the faculty's new demand to see a weekly build, and to show off our changelog in an all-hands meeting, means that we will be spending more time getting a solid changelog together, as well as spending time every week in the all-hands. We also came up with a new feature list detailing what we want to achieve by EAE Day, which is on the last day of classes. This is what it looks like:

Backlog

Most of the engineering work is focused on code cleanup, graphical settings, and of course bugfixing. As for the Foresight ability, a new idea was pitched, similar to the active reload mechanic in games like Gears of War. What this means is that the foresight would expand and contract, and if the player uses it within a certain range while it's contracting, it will go out farther. The boss was also overhauled: instead of shooting randomly in front of the player, it now shoots only at specific spots, which gives designers more control over level design. The boss change was easy enough, but the new foresight (or ping) has been taking a lot of time. I've been working with Vinod to get it working, but it's a hard one.


We submitted to IGF this week. It was a long and crazy week. I spent all of class time on Tuesday and Thursday running around, fixing bugs, helping people fix bugs, and helping get last minute changes and features in. And then, Thursday evening, madness began to take hold. By 9 PM, we had all the art in, features were working together, and everything seemed good. There were just four people left in the lab: me, Kyle, Tony, and Matt. Kyle and Matt were adding final touches to the trailer, adding new gameplay footage and editing it, while Tony and I were getting a build ready and testing the crap out of it to make sure everything was fine. The trailer was done around midnight, and Matt left, leaving me, Tony, and Kyle. This was when we came across a bug which would cause our game to slow down massively and become unplayable. After spending a couple of hours narrowing it down, we figured out it was happening inside the boss code that James made. So we got him on Skype and started discussing what was happening and how to fix it. Meanwhile, the wrapper was also giving us issues, so I called up Abhishek to see if he could help me. Did I mention that all of this was being broadcast on Twitch? Tony had a stream going, and what ended up happening was that he and Kyle were answering questions people had on Twitch, while James was Skyped in on the same computer, and I was on the phone with Abhishek, sitting in front of it all. There was a lot of confusion, with four different conversations happening at the same time. Ultimately, we squashed all those bugs, and around 4 AM, uploaded the build to IGF. Needless to say, I've been sleeping since then. We've gone through massive changes throughout these past two months, and I like to believe that we have an awesome game to show for it. None of this would have been possible without this crazy bunch of people I work with.

Here is the awesome trailer:

As we were roaring towards the IGF deadline, fall break fell a week before submission. Since we all knew that we had a lot to do, everyone agreed that we would work through the break. We continued adding more juice, and making what we already had juicier. Another big change was that we overhauled the boss that James had been working on. Earlier, it would simply aim to kill the player. Now we reworked the boss, and our level design philosophy along with it. Instead of killing the player, the boss now shoots down projectiles which convert the existing ground into slow tiles. This meant that all the levels were reworked to have no slow tiles in them. As for me, I spent the last two weeks all over the place, helping pretty much everyone on the team with anything engineering related. During whatever time I had on my own machine, I would fix bugs in pretty much anything. One big, unforeseen issue was that Joe's shaders weren't working in a packaged project. The shaders worked fine in the editor, but in a packaged game they simply did not show up. I spent an entire day with him, first going through his shader to see if there was some weird issue there, then going through the post processing methodology that Unreal uses, and then finally changing the shaders to work with the one way I found that would let me use his shaders both in the editor and in game. Not bad for a guy who is not at all into graphics programming, and had no idea how Unreal shaders worked. Other things I accomplished were stopping Brenton from going all sorts of crazy with particles, and helping him with Matinee to get a trailer ready. Also, Unreal released another update, which included a new way to do menus. So we overhauled our existing menu system to use Unreal's new approach, which looks much better, and allows non-engineers to design menus. With less than a week left for IGF, I know I'm going to be way more busy than I've been so far in this program.