Game Engineering 2 Final Project

During this final project we had to make a game using all the techniques and knowledge that we acquired over the course of the semester. This course focused on graphics to teach the necessary engineering concepts, even though it is not a graphics course. I decided to make a game that showcases the graphics-related techniques that we learned.

Mine is a shooting game. The player can move the camera around the screen along its width and height (the XY plane). If an object is at the center of the screen, it gets shot; the center of the screen is indicated with a crosshair. Every time an object gets shot its transparency increases by 0.1 (its alpha goes from 1.0 down to 0), until it becomes invisible. To make the game a little more challenging, the objects move around the scene.
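The hit logic itself is simple. Here is a minimal sketch of the idea; the GameObject struct and function name are hypothetical stand-ins, not the actual game code.

// Minimal sketch of the hit logic; the GameObject struct is a hypothetical
// stand-in, not the actual game code.
struct GameObject
{
	float alpha = 1.0f;	// 1.0 = fully opaque, 0.0 = invisible
	bool isVisible = true;
	// ... position, mesh, material, etc.
};

// Called for whichever object the crosshair (screen center) is currently over
void OnObjectShot( GameObject& object )
{
	if ( !object.isVisible )
		return;
	object.alpha -= 0.1f;	// each hit makes the object 0.1 more transparent
	if ( object.alpha <= 0.0f )
	{
		object.alpha = 0.0f;
		object.isVisible = false;	// fully transparent objects stop being drawn
	}
}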

The game looks as shown in the following screenshot.

final-game1

When objects get shot, their transparency increases as follows.

final-game2

Initially I wanted to make the camera move around the scene like a traditional first-person shooter, but I faced many issues implementing it because of time constraints. This is something I plan to finish in the future. Changing transparency at run time was a simple task to achieve, and the results were fun to watch. I used the sine function from the cmath library to move the objects in the scene. Showing some kind of visual effect when an object gets shot would be helpful; this is something I learned while presenting this game, and I will add this feature.

This class has been very interesting overall. As I mentioned, we learned the basics of graphics in order to understand engineering concepts, which is like learning something while putting it to practical use. Earlier I would not have preferred a graphics class; now I would love to sit in one. Beyond graphics, I learned how to build binary files and use them while running the game, something I will use in the future whenever I get the chance. We also got used to working in a multi-platform environment, and making things as platform independent as possible was very informative in many ways.

You can download the game from the following links.

Game-Release-Build-D3D

Game-Release-Build-OpenGL

Run game.exe after extracting the downloaded zip file.

Controls:
1. WASD to move the camera.
2. ‘space’ to shoot.
3. Escape to quit the game.

Game Engineering 2 Assignment_13

This assignment is about adding textures to the game objects. Texture-related information is provided through materials; shaders take this information and apply the textures to the game objects with the help of meshes.

The following screenshot shows the texture data inside the binary material file.

texture-binary-data

The red box indicates the number of textures used in the material, green indicates the sampler name used inside the shader, and blue indicates the path to the texture. I store both the sampler name and the texture path as null-terminated strings. I store the number of textures before this data so that I know exactly how many pairs of strings to read. Since I did not want to disturb the existing material data, I appended the texture data at the end of the binary material file.
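Reading that appended block back is straightforward. Here is a rough sketch, assuming the read pointer already sits just past the original material data; the names are illustrative, not the actual loader code.

#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Rough sketch of reading the appended texture data; currentData is assumed to
// point just past the original material data, and TextureInfo is a hypothetical type.
struct TextureInfo
{
	std::string samplerName;	// sampler name used inside the shader
	std::string texturePath;	// path to the texture asset
};

std::vector<TextureInfo> ReadTextureData( const char* currentData )
{
	// The texture count was written first so we know how many string pairs follow
	uint32_t textureCount;
	memcpy( &textureCount, currentData, sizeof( textureCount ) );
	currentData += sizeof( textureCount );

	std::vector<TextureInfo> textures;
	for ( uint32_t i = 0; i < textureCount; ++i )
	{
		TextureInfo info;
		info.samplerName = currentData;			// null-terminated string
		currentData += info.samplerName.size() + 1;	// skip past the terminator
		info.texturePath = currentData;
		currentData += info.texturePath.size() + 1;
		textures.push_back( info );
	}
	return textures;
}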

I added a new class for textures instead of adding texture handling to the material class. I wanted to keep the material class as platform independent as possible, and since textures are platform dependent, a separate class keeps things cleaner.

Output of the game in OpenGL is as follows:

textures-opengl-output

Output of the game in Direct3D is as follows:

textures-d3d-output

If we compare the two images, the output looks smoother in OpenGL than in Direct3D, especially the helix in the center.

Estimated time to finish this assignment was nearly 9 hours.

You can download the game from the following links.

Game-Release-Build-D3D

Game-Release-Build-OpenGL

Run game.exe after extracting the downloaded zip file.

Controls:
1. Arrow keys to move the box.
2. WASD to move the camera.
3. Escape to quit the game.

EAE open house

We had the EAE open house a couple of days back. Like last year there were many guests, including students from other departments, industry professionals, and the general public.

The EAE open house hosted C5 thesis games, C6 prototypes, capstone games, and games developed by the Gapp lab. Many people liked our game, the new tutorial was helpful, and we got some awesome feedback. We are planning to meet during the break to discuss it and proceed to adding new features.

Here are some pictures from the EAE open house.

eae-open-house1

eae-open-house2

Game Engineering 2 Assignment_12

This assignment is about creating materials for the objects. Earlier we used to change the visuals of the game objects on the screen through Effects directly. Now this is done through Materials.

Materials mainly control the look of the game objects, while effects do more. We don't want to create a new effect every time we have to change the color of an object, because we might end up duplicating a lot of shader-related functionality. Instead, we add a new concept to game objects called a material, which communicates with an effect and changes the look of the object. Multiple materials can share the same effect to achieve different results.

Following is a screenshot of a human-readable material Lua file.

material-lua-file

This material has two uniforms. One is an RGB color, which is blue, and the other is an alpha modifier that sets the transparency level to 0.7. Both are fragment shader uniforms. The RGB value sets the color that we want for this material, and the alpha modifier sets the transparency level. The material also stores the path to the effect that it uses.

I created a Material Builder to read this Lua file and generate a binary material file at build time. I used the following structure to represent the uniform data.

uniformData-struct

The order in which I chose to write to the binary file is: effect path, uniform data count, all of the uniform data, and then all of the uniform names. I added NULL characters after the effect path and after each uniform name, which helps when reading from the binary file.

Following is a screenshot of the binary material file in Direct3D.

material-binary-file

This is the binary version of the above Lua file. The count inside the red box is the number of uniforms; in this case it is 2, as shown in the Lua file. The number inside the green box indicates the type of shader each uniform belongs to; 1 indicates the fragment shader in both cases. The bytes inside the orange box are the uniform handle, which is 0 because I set it to NULL and update it at run time. The values in the blue box are the names of those two uniforms.

There is no special reason for writing the effect path first. It is followed by a NULL terminator so that I can extract the path easily. Then comes the number of uniforms, so that I know how many uniform data sets are ahead, followed by the uniform data sets themselves, and after that the uniform names, each with a NULL character, again for extraction purposes. Note that the order of the uniform names should match the order of the uniform data sets. This is an order I settled on based on earlier binary representations.
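Reading the file back mirrors that order. The following is a rough sketch; the sUniformData fields here are only an approximation of the structure in the screenshot, so the exact types and sizes are assumptions.

#include <cstdint>
#include <cstring>

// Approximation of the uniform-data struct (handle, shader type, values[]); the
// exact fields and sizes are assumptions for this sketch.
struct sUniformData
{
	void* handle;		// platform-specific handle, written as NULL and patched at run time
	uint8_t shaderType;	// e.g. 1 = fragment shader
	float values[4];	// unused slots may hold garbage, hence the debug/release difference
	uint8_t valueCount;
};

void ReadMaterial( const char* data )
{
	// 1) Effect path, terminated by the NULL that was written after it
	const char* effectPath = data;
	data += strlen( effectPath ) + 1;

	// 2) Number of uniforms
	uint32_t uniformCount;
	memcpy( &uniformCount, data, sizeof( uniformCount ) );
	data += sizeof( uniformCount );

	// 3) All uniform data sets in one contiguous block
	const sUniformData* uniforms = reinterpret_cast<const sUniformData*>( data );
	data += uniformCount * sizeof( sUniformData );

	// 4) Uniform names, one null-terminated string per uniform, in the same order
	for ( uint32_t i = 0; i < uniformCount; ++i )
	{
		const char* uniformName = data;
		data += strlen( uniformName ) + 1;
		// ... associate uniformName with uniforms[i]
	}
}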

Following is a screenshot of the binary material file in OpenGL.

material-binary-file-opengl

There is a difference in the size of the binary file between D3D and OpenGL. This is mainly because of the size difference of the uniform handle in those two configurations, which is evident from the above two images: in the D3D version check the bytes in the orange box, and in the OpenGL version the bytes in the blue box.

Whether we pick OpenGL or D3D, there will be a slight difference in the binary material files between the Debug and Release builds. This is because of the different garbage values filling the unused slots of the ‘values’ array shown in the structure above.

Finally, here is a screenshot of the output of the game after adding materials.

assignment12-output

There are four spheres in the scene, all made from the same sphere mesh (which has a default white color) but using four different materials. The red and green ones are opaque; the blue and yellow ones are transparent, with different levels of transparency.

Estimated time to finish this assignment was nearly 9 hours.

You can download the game from the following link.

Game-Release-Build-D3D

Run game.exe after extracting the downloaded zip file.

Controls:
1. Arrow keys to move the box.
2. WASD to move the camera.
3. Escape to quit the game.

Game Engineering 2 Assignment_11

In this assignment we create a plugin for Maya to export meshes to be used in the game. The other requirement is to make objects in the game transparent by setting different render states during run time.

MayaMeshExporter generates a plugin for Maya to export meshes. This project does not depend on any other project, and no other project depends on it; it is an independent project that generates meshes for the game in the format we specify.

Following is a screenshot of the loaded Maya plugin.

maya-plugin-manager

Following is a screenshot of debugging the plugin.

maya-debugging

Following is a screenshot of the output.

output_11

Following is a screenshot of the Lua effect file for the opaque effect with render states.

opaque-effect-lua

There are four render states used in the above file. All four of them are optional, and they are boolean values. When writing those values to the binary file we can use one bit for each, so we only need 4 bits to specify all four. I used an unsigned 8-bit integer to store those 4 bools, leaving room in case I want more than 4 in the future.
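To illustrate the packing, here is a rough sketch; the four state names and their bit positions are assumptions for the example, not necessarily the ones used in the project.

#include <cstdint>

// Illustrative bit layout; the actual state names and bit positions are assumptions.
namespace RenderStates
{
	const uint8_t AlphaTransparency = 1 << 0;	// 0000 0001
	const uint8_t DepthTesting      = 1 << 1;	// 0000 0010
	const uint8_t DepthWriting      = 1 << 2;	// 0000 0100
	const uint8_t FaceCulling       = 1 << 3;	// 0000 1000
}

// Pack the four optional booleans from the Lua file into a single byte for the binary file
uint8_t PackRenderStates( bool alpha, bool depthTest, bool depthWrite, bool cullFaces )
{
	uint8_t bits = 0;
	if ( alpha )      bits |= RenderStates::AlphaTransparency;
	if ( depthTest )  bits |= RenderStates::DepthTesting;
	if ( depthWrite ) bits |= RenderStates::DepthWriting;
	if ( cullFaces )  bits |= RenderStates::FaceCulling;
	return bits;
}

// At run time a single bitwise AND recovers each state
bool IsAlphaTransparencyEnabled( uint8_t bits )
{
	return ( bits & RenderStates::AlphaTransparency ) != 0;
}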

Following are screenshots of the binary effect files for both the opaque and transparent effects, along with their render states from the human-readable effect files.

binary-effect

opaque-render-states

binary-transparent-effect

tranparent-render-states

The first one is the opaque effect and the second one is the transparent effect. The highlighted 8 bits represent the render states. For the opaque effect the binary for the render states is ‘0000 1110’ and for the transparent effect it is ‘0000 1011’, represented as 0E and 0B respectively. I added the render states after the paths to both shaders, so I don't have to change anything related to reading the shader paths from the binary files.

The A in RGBA represents the alpha value, which ranges from 0 to 1 (or 0 to 255); D3D and OpenGL use the 0-to-1 representation. 0 is completely transparent and 1 is fully opaque. To display a transparent object correctly we need to consider the color of the background behind it and blend the two colors to get the final result. We use the following formula to calculate the resulting color:
(transparent_color * alpha) + (background_color * (1 - alpha))
where alpha is the alpha of the transparent object. To make this work correctly we need to draw transparent objects from back to front, because every time we draw an object we consider the color of whatever is already behind it. We tell the graphics API about this by setting the render states.
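As a sketch of how that formula maps onto the two APIs (assuming Direct3D 9 on the D3D side, with placeholder platform macros and device pointer), the alpha-blending render states look roughly like this; it is not the engine's exact code.

// Sketch of enabling alpha blending for transparent effects; the platform macro
// names are placeholders, and Direct3D 9 is assumed on the D3D side.
#if defined( PLATFORM_D3D )
	#include <windows.h>
	#include <d3d9.h>
	void EnableAlphaBlending( IDirect3DDevice9* direct3dDevice )
	{
		// result = (transparent_color * alpha) + (background_color * (1 - alpha))
		direct3dDevice->SetRenderState( D3DRS_ALPHABLENDENABLE, TRUE );
		direct3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_SRCALPHA );
		direct3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA );
	}
#elif defined( PLATFORM_OPENGL )
	#include <windows.h>
	#include <gl/GL.h>
	void EnableAlphaBlending()
	{
		glEnable( GL_BLEND );
		glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
	}
#endif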

Estimated time to finish this assignment was nearly 7 hours.

You can download the game from the following link.

Game-Release-Build-D3D

Run game.exe after extracting the downloaded zip file.

Controls:
1. Arrow keys to move the box.
2. WASD to move the camera.
3. Escape to quit the game.

Game Engineering 2 Assignment_10

This assignment is about drawing 3D objects on the screen, and displaying multiple objects with the correct depth based on their location. The user should also be able to control the camera along with the 3D box.

Following is a screenshot of the output.

3dObject

To achieve this we use three transformation matrices: Local to World, World to View, and View to Screen.

Local To World:
All the meshes we get are in a local coordinate system, meaning the positions of their vertices are relative to a local origin on the mesh. When we import them into the game world we need a Local to World transform so that we can place them wherever we want relative to the world coordinate system.

World To View:
We see the objects in the world through the camera, so we can only see the objects that the camera's view covers. World To View transforms the positions of all the objects so that they are relative to the camera's position and orientation. With this we know which objects the camera covers.

View To Screen:
Once we know which objects the camera covers, we need to project them onto the screen. View To Screen does this, taking into account the field of view and the near and far planes (the range we want to display). We basically project all of those objects onto a 2D screen every frame.
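Conceptually, the three matrices get chained into a single transform per draw call. Here is a minimal sketch of that chaining; the Matrix4x4 type and function names are placeholders rather than the engine's actual math code.

// Conceptual sketch of chaining the three transforms; Matrix4x4 is a placeholder
// for whatever math types the engine actually uses.
struct Matrix4x4 { float m[4][4]; };

Matrix4x4 Multiply( const Matrix4x4& a, const Matrix4x4& b )
{
	Matrix4x4 result = {};
	for ( int row = 0; row < 4; ++row )
		for ( int col = 0; col < 4; ++col )
			for ( int k = 0; k < 4; ++k )
				result.m[row][col] += a.m[row][k] * b.m[k][col];
	return result;
}

// Combine the per-object, per-camera, and projection transforms for one draw call.
// The vertex shader then only has to apply localToScreen to every local-space vertex.
Matrix4x4 CalculateLocalToScreen( const Matrix4x4& localToWorld,
                                  const Matrix4x4& worldToView,
                                  const Matrix4x4& viewToScreen )
{
	return Multiply( viewToScreen, Multiply( worldToView, localToWorld ) );
}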

Following is a screenshot of the box intersecting the floor.

box-intersecting-floor

In the above screenshot we have a clear idea of the box's position with respect to the floor. Earlier we simply drew one object on top of another, which does not work in this kind of scenario; we need to use a depth buffer, and this is how it works. The depth buffer ranges from 0.0 to 1.0, inclusive. During the View to Screen transformation, instead of projecting onto a 2D plane we project into a 3D volume whose thickness is the buffer range, so that we also keep the relative depth of all the meshes in the view. Before drawing each frame we clear the depth buffer to 1.0. Then, when we draw each mesh, we compare the depth of each candidate pixel from the mesh with the corresponding depth already stored for that pixel on the screen: if it is less than or equal to the stored value we draw it, otherwise we skip it. 0.0 indicates nearest to the screen and 1.0 is farthest, which is why we clear the depth buffer to 1.0 when we clear the frame and why we check whether the candidate pixel is less than or equal to the one on the screen.
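As a rough illustration of that setup on the OpenGL side (the D3D side is analogous), the depth states described above map to calls like these; this is a generic sketch, not the engine's exact code.

// Sketch of the depth-buffer setup described above, shown with OpenGL calls:
// clear to 1.0 and accept any depth less than or equal to the stored value.
#include <windows.h>
#include <gl/GL.h>

void SetUpDepthBuffer()
{
	glEnable( GL_DEPTH_TEST );	// compare each candidate pixel's depth with the stored one
	glDepthFunc( GL_LEQUAL );	// draw it if its depth is less than or equal
	glDepthMask( GL_TRUE );		// allow drawn pixels to update the stored depth
	glClearDepth( 1.0 );		// 1.0 is the farthest possible depth
}

void ClearFrame()
{
	// Reset both the color and depth buffers at the start of every frame
	glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
}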

Following is a screenshot of the human-readable Lua file of the floor mesh.

floor-luaMesh

If we look at the position inside each vertex, we can see a z-coordinate that was missing before. The position is now (x, y, z).

Following is a screenshot of the binary floor mesh.

floor-binMesh

The two highlighted 4-byte values indicate the number of vertices and the number of indices. Earlier the size of each vertex was 12 bytes; now it is 16 bytes, as we have added another coordinate, which is a float. If we look at the screenshot, after the first 2 lines every 4 lines represent one vertex; they follow a similar pattern until offset 44, after which come the indices. The first 3 lines of each vertex are the 3 coordinates and the 4th line is the color.
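A minimal sketch of how that layout can be read back, assuming a 16-byte vertex of three floats plus a 4-byte color and 32-bit indices; the struct and function names are illustrative only.

#include <cstdint>
#include <cstring>

// Sketch of the 16-byte vertex layout and read order described above; the names
// are illustrative, not the actual engine code (index size is assumed to be 32 bits).
struct sVertex
{
	float x, y, z;		// 3 floats = 12 bytes (previously only x and y)
	uint8_t r, g, b, a;	// packed color = 4 bytes, for 16 bytes per vertex
};

void ReadMeshData( const char* data )
{
	// The two highlighted 4-byte values come first
	uint32_t vertexCount, indexCount;
	memcpy( &vertexCount, data, sizeof( vertexCount ) );
	data += sizeof( vertexCount );
	memcpy( &indexCount, data, sizeof( indexCount ) );
	data += sizeof( indexCount );

	// Then the vertex block, then the index block
	const sVertex* vertices = reinterpret_cast<const sVertex*>( data );
	data += vertexCount * sizeof( sVertex );
	const uint32_t* indices = reinterpret_cast<const uint32_t*>( data );

	// ... hand vertices/indices to the platform-specific vertex and index buffers
}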

I created a new class called CameraObject inside Graphics, because I feel it belongs to the graphics part of the engine. I exposed access to its position so that it can be changed from the game.

Estimated time to finish this assignment was nearly 7 hours.

You can download the game from the following link.

Game-Release-Build-D3D

Run game.exe after extracting the downloaded zip file.

Controls:
1. Arrow keys to move the box.
2. WASD to move the camera.
3. Escape to quit the game.

Rockwell feedback

We showcased our game during an event hosted by Rockwell Collins and got some really helpful feedback.

We decided to make some minor changes to the mechanics, like replacing the deaf player's manual ping with dropping a beacon that makes sound to guide the blind player.

Following is a summary of the feedback that we received.

rockwell-feedback

Game Engineering 2 Assignment_09

This assignment is about making the Render function in Graphics platform independent, and also making part of the shaders platform independent with the help of a shader include file.

The following screenshot shows the platform-independent Render function.

graphics_render

There are four functions inside Render: ClearFrame, StartFrame, EndFrame, and ShowFrame. These four functions hide the platform-dependent functionality. ClearFrame clears the previous frame, StartFrame and EndFrame exist only for D3D, and ShowFrame presents the updated frame.
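In outline, the platform-independent Render looks something like the sketch below; only the four function names come from the actual code, and the bodies here are just placeholder comments.

namespace Graphics
{
	// Each of these wraps the platform-specific calls behind a common name
	void ClearFrame() { /* clear the previous frame's buffers */ }
	void StartFrame() { /* only D3D needs this; a no-op for OpenGL */ }
	void EndFrame()   { /* only D3D needs this; a no-op for OpenGL */ }
	void ShowFrame()  { /* present / swap buffers so the new frame becomes visible */ }

	// The Render function itself no longer contains any platform-specific code
	void Render()
	{
		ClearFrame();
		StartFrame();
		// ... submit draw calls for every mesh ...
		EndFrame();
		ShowFrame();
	}
}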

Following is a screenshot of the vertex shader with some platform independence.

vertex_shader

This file includes shaders.inc, which contains some important pieces that make this platform independence possible. g_position_offset is common to both D3D and OpenGL. The input and output variables are declared based on the platform (except that OpenGL does not use an output position variable); these have to be platform dependent. The output position calculation using the offset is platform independent, as shown above, but I had to use platform-dependent checks because the name of the output variable is different in the two cases.

The following screenshot shows the AssetsToBuild.lua file and how I implemented the dependencies that assets might need.

assetsToBuild_dependecy

For both the vertex and fragment shaders, the dependencies table contains the shaders.inc file. BuildAssets.lua checks the list of dependencies and their status, and rebuilds them along with all of the assets that depend on them.

Estimated time to finish this assignment was nearly 4 hours.

You can download the game from the following link.

Game-Release-Build-D3D

Run game.exe after extracting the downloaded zip file.

Controls:
1. Arrow keys to move rectangle.
2. Escape to quit the game.

Game Engineering 2 Assignment_08

In this assignment we have to add two more tools to the project: EffectBuilder and ShaderBuilder. The effect builder runs at build time and generates a binary file with the paths to both the vertex and fragment shaders. The shader builder builds the vertex and fragment shaders at build time and generates binary files with the contents of the shader files.

The process of making the effect builder is the same as the mesh builder, except now we only send the paths to be built through a Lua file. The shader builder is more complex, and its source code was given as part of the assignment.

Following is the human readable lua effect file.

luaEffect_file

We send the paths to the vertex and fragment shaders through this file. I kept it simple without adding anything extra, such as the number of characters as another entry in the table, because no one likes to count the characters in a string every time they update the path.

Following is the binary effect file.

binEffect_file

I added a NULL character at the end of both paths when writing to the binary file. I have highlighted the first NULL, which is 00 in hex; the last hex character is also NULL. On the right side the readable version is displayed, since the content is just strings.

At run time those NULL characters help to extract the individual paths from the binary. The content of the binary file is copied into a temporary buffer. The vertex shader path is read starting at the address of the temporary buffer, terminated by the first NULL character. The fragment shader path is read starting at the address of the character just after that first NULL.
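A minimal sketch of that extraction, assuming 'buffer' already holds the whole binary effect file; the function name is just for illustration.

#include <cstring>

// Sketch of the extraction described above; 'buffer' is assumed to already hold
// the entire contents of the binary effect file.
void ExtractShaderPaths( const char* buffer,
                         const char*& o_vertexShaderPath,
                         const char*& o_fragmentShaderPath )
{
	// The vertex shader path starts at the beginning of the buffer and ends at the first NULL
	o_vertexShaderPath = buffer;
	// The fragment shader path starts at the character right after that first NULL
	o_fragmentShaderPath = buffer + strlen( o_vertexShaderPath ) + 1;
}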

I used two different shader builders for the vertex and fragment shaders. I chose this method because I did not want to change the AssetsToBuild.lua file, as that would disturb its structure. Otherwise I would have to extend the ‘builder’ entry in that file with another value deciding whether it is a vertex or fragment shader, and I would also have to edit BuildAssets.lua to handle that change.

We defined separate debug and release modes for the shaders. This lets us debug the shaders even when the complete game is in release mode, so we can check the results of the shaders while the game is in its optimized state.

Following are comparisons of the binary vertex shader files in OpenGL/D3D and debug/release.

D3D – debug

bin_vertex_shader_d3d

D3D – release

bin_vertex_shader_release_d3d

In the release mode of the D3D vertex shader there are many optimizations. We can notice that the variable names present in the debug mode are gone.

OpenGL – debug

bin_vertex_shader_debug_opengl

OpenGL – release

bin_vertex_shader_release_opengl

The major difference between the OpenGL debug and release versions is the comments. The release mode does not have the comments, while comments in the debug mode help us understand things more easily when we are debugging.

Estimated time to finish this assignment was nearly 5 hours.

You can download the game from the following link.

Game-Release-Build-D3D

Run game.exe after extracting the downloaded zip file.

Controls:
1. Arrow keys to move rectangle.
2. Escape to quit the game.

IGF Submission

Finally the day came. We worked until 5am and submitted our finished build to IGF. We finished everything that we planned to implement. During the last day we worked continuously while the producers kept testing the game to make sure everything was working.

And finally we finished the IGF build and submitted it. We celebrated the moment and went home to get some rest.

Here is the gameplay video of the IGF submission build.