Game Engineering II – Final Project

For the final project we had to make a game using the engine that we developed over the course, applying most of the methodologies and techniques that we were taught.

Our engine contains both OpenGL and Direct3D configurations, so you can build and render your game on either platform. We have a Maya Mesh Exporter that lets you export a model directly from Maya into your game assets and use it in your game. We have Materials where you can adjust alpha transparency and RGB values as well as add textures. The engine also contains builders such as the MeshBuilder, AssetBuilder, ShaderBuilder and MaterialBuilder, whose job is to convert the assets to binary formats. The binary formats are significantly smaller than the original asset files.

The game I decided to create is an exploration game where you control a character and can interact with the objects on an island, as shown in the picture below:

bandicam 2015-12-18 17-54-58-352

The main mechanic of the game is that when you come in contact with an object, its color changes. As you can see in the picture below, when the main character comes in contact with the house, its color changes to a deep blue.

bandicam 2015-12-18 17-55-17-756

After interacting with all the objects, their final colors look like the picture below. I compare the position of the character with each object on the island and check whether the character is within the same x-z region as the object. To switch the color, when the character is within that region I swap the object's current material with a new material that is already loaded. I decided to go with a swap because at the time I did not know whether I wanted to change the effect (transparent or opaque), the color (RGB) or the texture; the new material let me try each of those and see which looked better. Ultimately, for objects on the island I just change the RGB values, and for the sky I change the texture.
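A rough sketch of that interaction check and material swap might look like the following; the types and member names here (GameObject, halfWidth, interactedMaterial) are hypothetical placeholders rather than the engine's actual classes:

```cpp
#include <cmath>

// Hypothetical sketch of the x-z overlap check and material swap described above
struct Vector3 { float x, y, z; };

struct GameObject
{
	Vector3 position;
	float halfWidth, halfDepth;   // extents of the object's footprint on the x-z plane
	void* material;               // the material currently assigned to the object
};

// True when the character overlaps the object's footprint on the x-z plane
bool IsTouching( const GameObject& character, const GameObject& object )
{
	const bool overlapX = std::fabs( character.position.x - object.position.x ) < object.halfWidth;
	const bool overlapZ = std::fabs( character.position.z - object.position.z ) < object.halfDepth;
	return overlapX && overlapZ;
}

void UpdateInteraction( const GameObject& character, GameObject& object, void* interactedMaterial )
{
	if ( IsTouching( character, object ) )
	{
		// Swap in the pre-loaded material instead of modifying the current one
		object.material = interactedMaterial;
	}
}
```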

One of the mistakes I was initially making was adjusting the objects' positions in Maya and bringing them into the game that way, so checking whether the character was touching other objects on the island did not work correctly. Later I exported the models centered in Maya, without adjusting their positions, and positioned them in the engine's graphics code instead. This made the character-object interaction work correctly. It appears that even if you move objects around in Maya and export them, their centers do not move.

bandicam 2015-12-18 17-55-27-767

There are other ways in which you can influence the environment. If you press Left Shift, the sky in the background changes to a sunset texture.

bandicam 2015-12-18 17-55-35-465

Finally, if you keep “0” pressed on the NumPad, a fish will pop up and go back down for as long as the button is held. The fish moves up and down in a sine wave, which is done by taking the sine of the total seconds elapsed and multiplying it by a height to adjust the amplitude of the wave. The same principle is used to move the water up and down in a wavering sinusoidal fashion.
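A minimal sketch of that sine-based motion, assuming a total elapsed time in seconds and an amplitude value (the names here are placeholders, not the engine's variables):

```cpp
#include <cmath>

// Sine-based bobbing: sin() of the total time gives a value in [-1, 1];
// multiplying by the amplitude scales the wave's height, and baseHeight
// keeps the object around its rest position.
float ComputeBobbingY( float totalElapsedSeconds, float baseHeight, float amplitude )
{
	return baseHeight + amplitude * std::sin( totalElapsedSeconds );
}
```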

bandicam 2015-12-18 17-55-47-789

The first things I learned were how to make code platform independent and how to write and read binary files. I did not have much of a clue about what a binary file does in a game, so reading one was totally cryptic for me. But John Paul explained the concepts really well, including how to read and interpret binary files, which will be very useful for my future projects.

Another thing I have learnt is how to design and architect code. I never paid much attention to designing and organizing code, but through this course I have learnt that it is very important. You can learn graphics, gameplay or AI, but learning how to integrate them and read between the lines is just as important; that is something I learned in this class and need to keep working on. I learnt about decoupling code, abstraction, platform independence and, most importantly, keeping the Lua files simple so that they are easy to understand and clear.

My biggest takeaway from this class is the asset build pipeline, especially the Maya Mesh Exporter. On the whole this class has been an enjoyable experience, with our attention focused on the asset build system and on the basics of graphics. Many graphics books will explain tricks, but the things we learned in this class are rarely found in books. Courses like this are really rare, and I will definitely look back on what I have learned here for future projects.

The Gameplay Video of my Game is as follows:

Gameplay Controls –

Use the arrow keys to control the character:
1) Left and Right Arrow keys move along the x axis.
2) Up and Down Arrow keys move along the z axis.
3) Press X to rotate clockwise.
4) Press C to rotate anticlockwise.

Press Left Shift to change the cloudy sky to a sunset.

Press and hold “0” on the right NumPad to see the fish move up and down.

Click the following link to download the game.

FinalProject

Estimated Time for completion – 20 hours.

Game Engineering II – EAE 6320 – Assignment_13

The main objective of this assignment is to add textures to our models from Maya. Our material file will contain the path of the texture file, which will be in either PNG or JPEG format.

Every texture has its own texture coordinates, called u,v coordinates. We need to add a uniform variable called a sampler, which samples the texture at the u,v coordinates exported from Maya.

Earlier the Maya Exporter only exported a position and a color, so now we need to update it to also write the u,v texture coordinates into the mesh files exported from Maya. We also need to add a Texture Builder tool to process the texture files in the asset folder.

The textures need to be associated with the material files, so that a material file lets us set textures as well as effects. The material file now allows us to make game objects opaque or transparent, change the RGB color values and add textures.

The following picture shows the blue transparent material's Lua file. As discussed above, the texture_data table is the addition to the file; it contains the “sampler” name and the path of the texture file.

texture

After adding the textures the binary file looks like as follows:

bandicam 2015-12-17 21-51-21-814

After the null terminator of the uniform name, we see a “01”, which indicates the texture count. This material uses only one texture, so the count is 1. The texture count is followed by the texture path. It was easier to append the texture data after the effect path and uniform data, since I didn't need to change the existing effect path and uniform data. I thought about adding the textures inside the materials rather than creating a new struct for textures, but it was cluttering my materials, so I added a new struct for the texture to keep the code clean.
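As a rough sketch of how that appended block could be parsed, assuming the layout described above (a one-byte texture count followed by a null-terminated path); the names here are placeholders, not the engine's API:

```cpp
#include <cstdint>
#include <cstring>
#include <string>

struct sTextureInfo
{
	std::string path;
};

// Reads the texture block that follows the uniform data and returns a pointer
// just past it
const char* ReadTextureBlock( const char* data, sTextureInfo& o_texture )
{
	// First byte after the uniform data is the texture count (1 in this material)
	const uint8_t textureCount = *reinterpret_cast<const uint8_t*>( data );
	data += sizeof( uint8_t );

	if ( textureCount > 0 )
	{
		// The texture path is stored as a null-terminated string right after the count
		o_texture.path = data;
		data += std::strlen( data ) + 1;
	}
	return data;
}
```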

Direct3D Output

Direct3D

OpenGL Output

OpenGL

If you compare the two outputs, you will notice that the Direct3D output is a bit sharper around the edges than the OpenGL output. This is because of the difference in sampling between the two platforms.

Estimated Completion Time – 16 hours

The controls for camera are First Person Shooter controls as follows:
A –     to strafe left
D –    to strafe right
W –   to move forward
S –     to move backward
Space – To fly upward
Z –    to Dive Downward
E –    rotate Camera ClockWise
Q –   rotate Camera AntiClockwise

C – rotate the man around the house in a clockwise direction to guard the house
X – rotate the man around the house in an anti-clockwise direction to guard the house

Use the arrow keys to control the man's movement

Link To download :

Direct3D_Release_13

After downloading, run Game.exe.

Semester 3 Thesis – EAE Day Open House

Today is the day when cohort 5 and cohort 6 showcase their games. Everyone is excited for the feedback we will get on EAE Day.

It was a great day for us. People liked the game overall. I was very pleased with most of the feedback on the wand weapon system and the environment sound system. We are proud of our effort to produce a good game for IGF.

The best part of the day was getting the BB-8 Star Wars cake. Can't wait to watch the movie on the 18th.

cake

Semester 3 Thesis – 8

Finally I have finished my last project task for this semester. The bee with my particles looks as follows:

bee

The bee model, which looks quite cool, was created by Avinash.

Since the blind player initially has no clue about the mechanics, I programmed a firefly-like behavior: the bee orbits the blind player, rotating 90 degrees at a time, and emits fire triangle particles. The bee has an audio source attached. Only the deaf player can see the bee; the blind player cannot, but can hear its sound. As soon as the blind player identifies the direction of the bee's sound, the bee rotates another 90 degrees. These events occur 4 times during the tutorial, which prepares the blind player and helps them learn the mechanic.

Game Engineering II – EAE 6320 – Assignment_12

Overall this assignment was really long, and I found it pretty challenging since the instructions were not as straightforward as in the other assignments. The requirements were not given sequentially, so it wasn't clear whether to start from the top or the bottom. However, it was worth figuring out and I learnt a lot.

The main objective of this assignment is to create the material system for the game. Earlier, the color of a mesh came from the vertex colors in its mesh file, so if we wanted a different color we had to create a new mesh file with different colors. With this assignment we can reuse the same meshes and get different colors through the materials we have created.

Capture

The screenshot of my human readable file is shown above.
The pathEffect is the relative path of the effect file. The UniformData table holds the information for each uniform:
UniformName is the name of the uniform.
ShaderType indicates whether the uniform belongs to the vertex or the fragment shader, and the values are the values of the uniform.
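For illustration, the per-uniform information above could be represented by a small struct along these lines; the exact names and layout in the engine may differ:

```cpp
#include <cstdint>

// Illustrative per-uniform record matching the fields described above
enum class eShaderType : uint8_t
{
	Vertex,
	Fragment,
};

struct sUniformData
{
	const char* uniformName;   // e.g. the colorRGB or alpha modifier uniform (illustrative)
	eShaderType shaderType;    // which shader the uniform belongs to
	float values[4];           // up to four float values for the uniform
	uint8_t valueCount;        // how many of the values are actually used
};
```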

Hex

The above picture shows the binary version of the effect_transparent.lua file which is the blue transparent material’s binary file.

The red highlight shows the path of the effect file. The yellow highlight shows 02, which indicates the number of uniforms (2). The pink highlights show the uniform names; since this is a transparent material it has both the colorRGB and the alpha modifier. The green highlights are the null terminators. The blue highlight is the struct of the uniform data.

If this were an opaque material it would not have the alpha modifier. The picture below shows the binary file for an opaque material, where you can only see the colorRGB.

hes opaque

The output is shown below.

It contains 4 box meshes using 4 different materials. The first two, red and green, are opaque and hence have no transparency. The last two, yellow and blue, are transparent materials: yellow has an alpha of 0.3 while blue has an alpha of 0.7, so the yellow box looks more see-through than the blue one. The transparency is controlled by setting the alpha modifier value to 0.3 for the yellow box and 0.7 for the blue box.

bandicam 2015-12-07 00-00-33-977

I ran into a weird bug while doing this assignment. It looked like a cracked screen, as follows:

bandicam 2015-12-08 16-56-59-875

Initially I thought that I was not writing or reading the Lua file properly, but after looking at the binary files and debugging the values, that was not the problem. It turned out to be a simple one: I was not setting the path of the effect file, which caused this weird pattern. It still took me quite a while to find. Another problem I faced was that my s_direct3dDevice was becoming null. The cause was that I was creating the material before CreateDevice(); after moving the material creation to after CreateDevice() was called, the device was no longer null. This again took me a long time to fix.

You can download the Zip File of Assignment_12 from the following link to see the output.

The controls for camera are First Person Shooter controls as follows:
A –     to strafe left
D –    to strafe right
W –   to move forward
S –     to move backward
Space – To fly upward
Z –    to Dive Downward
E –    rotate Camera ClockWise
Q –   rotate Camera AntiClockwise

C – rotate the man around the house in a clockwise direction to guard the house
X – rotate the man around the house in an anti-clockwise direction to guard the house

Use the arrow keys to control the man's movement

Direct3D_Release_12

After downloading, run Game.exe.

Game Engineering II – EAE 6320 – Assignment_11

The main objective of this assignment is to create a Maya Exporter, which is used to export a Maya model in our human readable format. We can then load this human readable format into the game. We also create a transparency effect, for which we create render states for the opaque and transparent effects.

The job of the MayaMeshExporter is to convert a model in Maya to our human readable format. For that we write a function called WriteToMeshFile(), which writes the Maya model out in our format.
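As a rough illustration, assuming the vertex and index data have already been extracted from Maya, WriteToMeshFile() might write the human readable Lua format along these lines; the actual exporter gathers this data through the Maya API, and the exact layout shown here is illustrative:

```cpp
#include <cstdio>
#include <vector>

// Simplified vertex with the position and color the exporter writes out
struct sVertex
{
	float x, y, z;
	unsigned char r, g, b, a;
};

bool WriteToMeshFile( const char* i_fileName,
	const std::vector<sVertex>& i_vertices,
	const std::vector<unsigned int>& i_indices )
{
	FILE* file = std::fopen( i_fileName, "w" );
	if ( !file )
		return false;

	// Emit a Lua table with a vertices list and an indices list
	std::fprintf( file, "return\n{\n\tvertices =\n\t{\n" );
	for ( const sVertex& v : i_vertices )
	{
		std::fprintf( file, "\t\t{ position = { %f, %f, %f }, color = { %u, %u, %u, %u } },\n",
			v.x, v.y, v.z, v.r, v.g, v.b, v.a );
	}
	std::fprintf( file, "\t},\n\tindices =\n\t{\n" );
	for ( const unsigned int index : i_indices )
	{
		std::fprintf( file, "\t\t%u,\n", index );
	}
	std::fprintf( file, "\t},\n}\n" );

	std::fclose( file );
	return true;
}
```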

As far as dependencies go, the MayaMeshExporter is independent, i.e. it does not depend on any other project in the solution, because its sole purpose is to convert the Maya model into the human readable format. You can build the MayaMeshExporter on its own, export a Maya model, and it will generate the human readable file for that model, which is a great feature to have.

The following image shows the Plug-in Manager in Maya 2016. You can also see the two custom names. For my Release version the name is “eae6320_mankoo_hardit.mll” and for my DEBUG version the name is “eae6320_mankoo_hardit_DEBUG.mll”.

bandicam 2015-11-16 22-13-23-117

My scene below contains 4 meshes:

  1. House Mesh
  2. Man Mesh
  3. Floor Mesh
  4. Sphere Mesh

bandicam 2015-11-16 22-28-18-887

My output is shown in the image above. You can see the transparency effect on the sphere, which is between the man and the house. Earlier I had given the same color to all the triangle faces of the house and the man, but they looked like flat planes. To give the models some depth, I varied the colors on the faces, with the lightest colors on the left and darker colors on the right, to give the effect of a point light on the left of the scene. This will be much simpler once we learn texture coordinates.

The man, the house and the floor are opaque, i.e. they use the opaque effect, which means their alpha_transparency is disabled. Their settings are as follows:
alpha_transparency = false;
depth_testing = true;
depth_writing = true;
face_culling = true;

The sphere, which lies between the man and the house, has alpha transparency enabled and depth writing disabled. Because of this the sphere looks semi-transparent, and every object behind it that is covered by the sphere is still visible through it with a semi-transparent effect, while objects in front of it are unaffected. Its transparency settings are as follows:
alpha_transparency = true;
depth_testing = true;
depth_writing = false;
face_culling = true;

My effect.lua file looks as follows. As you can see, I chose to put all the render states in one table called “render_states”, because it makes sense to group the 4 boolean values that represent the render states under one table, and it is easier for me to understand.

bandicam 2015-11-16 22-42-55-605

We store the render states in a uint8_t variable, an 8-bit data type. Of those 8 bits we only use the 4 rightmost bits, which correspond to alpha_transparency, depth_testing, depth_writing and face_culling, so a uint8_t is big enough and we don't need to consider uint16_t and so on.
Earlier I had hardcoded the values, but from the discussions in class it makes sense not to hardcode them and to associate a name with each bit. The advantage of associating a name is that we don't need to remember which bit represents which render state.
For this I just defined 4 names as follows:
bandicam 2015-11-16 23-09-38-943

I used the same order given on the assignment page, i.e. alpha_transparency is bit 0, depth_testing is bit 1, depth_writing is bit 2 and face_culling is bit 3. Our render state is a combination of these 4 values:
Alpha Transparency, Depth Testing, Depth Writing and Face Culling. All of them are booleans, which lets us enable or disable them.
To enable or disable these values we use bitwise operations. Say we have our render_state_value and we want to enable alpha transparency and depth testing. For this we do a bitwise OR so that we set the two rightmost bits, as follows:

render_state_value |= 1 << 0;  This enables alpha transparency.
render_state_value |= 1 << 1;  This enables depth testing.
So now our render_state_value looks like 0011.
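To make the bit manipulation concrete, here is a small sketch of named bit flags and helpers, assuming the bit order described above; the identifiers are illustrative rather than the engine's exact names:

```cpp
#include <cstdint>

// Named render-state bits, in the order described above
namespace RenderStates
{
	const uint8_t AlphaTransparency = 1 << 0; // bit 0
	const uint8_t DepthTesting      = 1 << 1; // bit 1
	const uint8_t DepthWriting      = 1 << 2; // bit 2
	const uint8_t FaceCulling       = 1 << 3; // bit 3
}

inline void EnableState( uint8_t& io_renderStateBits, uint8_t i_state )
{
	io_renderStateBits |= i_state;            // set the bit
}

inline void DisableState( uint8_t& io_renderStateBits, uint8_t i_state )
{
	io_renderStateBits &= ~i_state;           // clear the bit
}

inline bool IsStateEnabled( uint8_t i_renderStateBits, uint8_t i_state )
{
	return ( i_renderStateBits & i_state ) != 0;
}

// Example: the transparent effect (0x0B == 1011) has alpha transparency,
// depth testing and face culling enabled, and depth writing disabled.
```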

My opaque effect looks as follows:
bandicam 2015-11-16 23-45-30-165
My transparency effect looks as follow:
bandicam 2015-11-16 23-44-58-120

This is how my binary files look for the opaque and transparency effect.
If you look at the highlighted value for the opaque effect, it shows “0E”, which is 0xE, i.e. 1110. This tells us that alpha transparency (the rightmost bit) is disabled, while depth testing, depth writing and face culling are enabled. This is correct, since opaque objects have no transparency, and it matches the values in our human readable effect file.
Similarly, for the transparent effect the highlighted value shows “0B”, which is 0xB, i.e. 1011. This means that alpha transparency, depth testing and face culling are enabled but depth writing is disabled. This again matches our human readable file.

For debugging the meshes from Maya I created a primitive test model called BigBox.mesh. In the debugging screenshot below you can see the name of the asset being debugged in i_fileName, which is BigBox.mesh. It also shows the vertex buffer and index buffer information for the model, which is a very cool feature.

Debug Maya 11

Overall this assignment took me 17 hours to finish.

You can download the Zip File of Assignment_11 from the following link to see the output.

The controls for camera are First Person Shooter controls as follows:
A –     to strafe left
D –    to strafe right
W –   to move forward
S –     to move backward
Space – To fly upward
Z –    to Dive Downward
E –    rotate Camera ClockWise
Q –   rotate Camera AntiClockwise

C – rotate the man around the house in a clockwise direction to guard the house
X – rotate the man around the house in an anti-clockwise direction to guard the house

Use the arrow keys to control the man's movement

Direct3D_Release_11

After downloading, run Game.exe.

Semester 3 Thesis – 7

We finally submitted the game to IGF at 5:30 in the morning! Everyone is burned out, but we feel great and proud as a team. Here is the gameplay video of our game, Blind Trust.

We are approaching the end of the semester. The only thing that needs to be updated now is the tutorial level; from the gameplay, the tutorial does not seem intuitive enough at the moment, so that is the final part we will be working on this semester.

My task now is to code the behavior of an insect that looks like a bee, with some particles emitting from its tail. It will orbit the blind player in a sine-wave fashion. The bee has a sound source on it, and as soon as the blind player points to the sound source, the bee moves to another point around the player and the player has to identify the sound again. This process occurs 4 times.

The task list is as follows; my name is next to my tasks.

pasted_image_at_2015_12_03_11_58_am_1024

Game Engineering II – EAE 6320 – Assignment_10

The main objective of this assignment was to draw a 3D cube on the screen. We converted our previous 2D-looking square into a cube by making a couple of changes. Along with the cube we also made a camera with which we can move around in the game.

The 3D cube looks like the picture below:

bandicam 2015-11-09 21-50-30-599

The first important thing we do to convert from 2D to 3D is to add the third dimension, the z axis, to the sVertex struct. We start by adding a z coordinate to our Lua mesh file and updating the MeshBuilder to read the z value.

We also create a frustum with several properties, such as the Field Of View, which describes the angle within which we can see the world, and near and far clipping planes; any object outside this range is not visible.

The other important change is that we now need to perform an MVP (model-view-projection) matrix transformation. Initially all our meshes exist in a local coordinate system, where each object has its own local center and its orientation and coordinates are relative to its own center or pivot.

To know where this object exists in global space, or the world, we perform a Local To World matrix transformation. After this, the object's orientation and coordinates are relative to the world.

Now that the object exists in the world, we need to bring it into the view of the camera, i.e. determine whether the camera's view covers the object. So we orient and transform its coordinates relative to the camera, which is called the World To View transformation. We do this because if the object is behind the camera in the world, it will not be within the camera's view and hence will not be visible.

Like a painter drawing 3D objects on a 2D canvas, we now need to project the objects visible to the camera onto the 2D screen. For this we perform the View to Screen transformation.
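To make the chain of transformations concrete, here is a minimal sketch that concatenates the three matrices and applies them to a local-space vertex; the Matrix4x4 and Vector4 types and the function names are simplified stand-ins for the engine's math library:

```cpp
#include <cstddef>

struct Matrix4x4 { float m[4][4]; };
struct Vector4 { float x, y, z, w; };

// Standard row-by-column 4x4 matrix multiplication
Matrix4x4 Multiply( const Matrix4x4& lhs, const Matrix4x4& rhs )
{
	Matrix4x4 result = {};
	for ( size_t row = 0; row < 4; ++row )
		for ( size_t col = 0; col < 4; ++col )
			for ( size_t k = 0; k < 4; ++k )
				result.m[row][col] += lhs.m[row][k] * rhs.m[k][col];
	return result;
}

// Applies a 4x4 matrix to a 4-component vector
Vector4 Transform( const Matrix4x4& matrix, const Vector4& v )
{
	return
	{
		matrix.m[0][0] * v.x + matrix.m[0][1] * v.y + matrix.m[0][2] * v.z + matrix.m[0][3] * v.w,
		matrix.m[1][0] * v.x + matrix.m[1][1] * v.y + matrix.m[1][2] * v.z + matrix.m[1][3] * v.w,
		matrix.m[2][0] * v.x + matrix.m[2][1] * v.y + matrix.m[2][2] * v.z + matrix.m[2][3] * v.w,
		matrix.m[3][0] * v.x + matrix.m[3][1] * v.y + matrix.m[3][2] * v.z + matrix.m[3][3] * v.w,
	};
}

// Model -> world -> view -> projection, applied right-to-left to a local-space vertex
Vector4 ProjectVertex( const Vector4& localPosition,
	const Matrix4x4& localToWorld, const Matrix4x4& worldToView, const Matrix4x4& viewToScreen )
{
	const Matrix4x4 localToScreen = Multiply( viewToScreen, Multiply( worldToView, localToWorld ) );
	return Transform( localToScreen, localPosition );
}
```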


We also need to enable the depth buffer. Earlier, depth testing was disabled and we did not store any depth information, so objects were simply stacked in the order they were drawn and the last object drawn was the most visible. In other words, we were relying on the Painter's algorithm, which was inconvenient.

Now we perform a depth test. The depth buffer stores a depth value per pixel, ranging from 0.0 to 1.0, where 1.0 is the farthest depth. The depth of each incoming pixel is compared with the one in the depth buffer, and the pixel with the lower value gets drawn: the lower the depth value, the closer the pixel is to the near plane, and the higher the value (1.0 being the maximum), the farther away it is. We clear the depth buffer to 1.0 because 1.0 is the maximum depth, and we use a less-than comparison because we want to draw the pixels that are closest to the camera.
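That depth setup might look roughly like the following on the two platforms; the platform macros and the wrapper function are assumptions on my part, while the Direct3D 9 and OpenGL calls are the standard ones for enabling, comparing and clearing depth (s_direct3dDevice is the device mentioned elsewhere in these posts):

```cpp
void EnableAndClearDepthBuffer()
{
#if defined( PLATFORM_D3D )
	// Direct3D 9: enable depth testing and depth writes, compare with less-than
	s_direct3dDevice->SetRenderState( D3DRS_ZENABLE, D3DZB_TRUE );
	s_direct3dDevice->SetRenderState( D3DRS_ZWRITEENABLE, TRUE );
	s_direct3dDevice->SetRenderState( D3DRS_ZFUNC, D3DCMP_LESS );
	// Clear the depth buffer to 1.0, the farthest possible depth, every frame
	s_direct3dDevice->Clear( 0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
		D3DCOLOR_XRGB( 0, 0, 0 ), 1.0f, 0 );
#elif defined( PLATFORM_GL )
	// OpenGL: the same idea with the fixed-function depth state
	glEnable( GL_DEPTH_TEST );
	glDepthMask( GL_TRUE );
	glDepthFunc( GL_LESS );
	glClearDepth( 1.0 );
	glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
#endif
}
```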

My floor mesh looks as follows:
bandicam 2015-11-10 00-44-11-410
As you can see in the picture, there is now a z axis in the mesh file. Because of that, the size of each vertex is now 16 bytes: 4 bytes for each of the three axes plus the color.
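For reference, a vertex struct with that 16-byte layout could look like this (member names are illustrative):

```cpp
#include <cstdint>

// Three 4-byte floats for position plus four 1-byte color channels = 16 bytes
struct sVertex
{
	float x, y, z;          // 12 bytes of position
	uint8_t r, g, b, a;     // 4 bytes of color
};
static_assert( sizeof( sVertex ) == 16, "sVertex is expected to be 16 bytes" );
```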

bandicam 2015-11-10 00-43-28-972
My Camera lives in the Graphics project because I felt it is more graphics-related than gameplay. The camera has several movement functions, and a camSpeed that can be adjusted to move the camera faster or slower. I have added several controls for the camera; the list is given near the download section. I also figured out the rotation for the camera, which took some time: I was not multiplying by the elapsed time, so it was rotating very fast at computer speed rather than in real-time seconds, but it works properly now. I also added rotation to the cube, though the cube's rotation seems a bit orbital for now.
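A minimal sketch of that frame-rate-independent movement, with simplified placeholder types rather than the engine's actual camera class:

```cpp
struct Camera
{
	float x, y, z;
	float yawRadians;
	float camSpeed;          // movement speed in units per second
	float rotationSpeed;     // rotation speed in radians per second

	// Scaling by the elapsed time keeps movement and rotation consistent in
	// real-time seconds instead of depending on the machine's frame rate.
	// (Forward is simplified to +z here.)
	void MoveForward( float i_elapsedSeconds )     { z += camSpeed * i_elapsedSeconds; }
	void RotateClockwise( float i_elapsedSeconds ) { yawRadians += rotationSpeed * i_elapsedSeconds; }
};
```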

In the picture below I was just trying to draw some basic primitive 3D shapes in order to understand the winding order properly.

bandicam 2015-11-08 01-04-00-822

Time Taken – 18 hours.

Here’s a recording of the Camera and Cube Controls.

You can download the Zip File of Assignment_10 from the following link to see the output.

The controls for camera are First Person Shooter controls as follows:
A – to strafe left
D – to strafe right
W – to move forward
S – To move backward
Space – To fly upward
Z – To Dive Down
E – Rotate Camera ClockWise
Q – Rotate Camera AntiClockwise

C – rotate the cube clockwise
X – rotate the cube anticlockwise
Use the arrow keys to control the cube's movement

Direct3D_Release_10

After downloading, run Game.exe.

Game Engineering II – EAE 6320 – Assignment_09

The main objective of this assignment was to make Render() platform independent. As you can see in the picture below, I have split Render() for both Direct3D and OpenGL into Clear(), BeginDraw(), EndDraw() and ShowBuffer().

There is no platform-specific code within this Render() function; it is platform independent. The implementation details obviously differ depending on the platform. The Clear() function clears the screen to black. BeginDraw() and EndDraw() do real work in the D3D implementation; for OpenGL I just return true from both functions.
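A sketch of what that split might look like, with illustrative stubs standing in for the platform-specific implementations (the real bodies live in the Direct3D and OpenGL source files):

```cpp
// The four helpers named in the post; stub bodies shown only for illustration
bool Clear()      { /* clear the screen to black (D3D Clear() / glClear) */ return true; }
bool BeginDraw()  { /* D3D: BeginScene(); OpenGL: nothing to do */          return true; }
bool EndDraw()    { /* D3D: EndScene();   OpenGL: nothing to do */          return true; }
bool ShowBuffer() { /* present / swap the back buffer */                    return true; }

void Render()
{
	// No platform-specific code here; each helper hides the platform details
	Clear();
	BeginDraw();
	// ... draw the meshes here ...
	EndDraw();
	ShowBuffer();
}
```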

bandicam 2015-11-03 17-14-16-108

Below is the screenshot of my vertex shader. I had to take two screenshots, split in half, because it didn't fit on the screen.

bandicam 2015-11-03 17-18-48-408

bandicam 2015-11-03 17-18-18-464

The vertex shader includes shaders.inc, which makes things more platform independent. I have also made the output position calculation platform independent; for calculating the output positions, however, I am using #if defined blocks for OpenGL and Direct3D. The g_offset position is common to both the OpenGL and Direct3D platforms.

My AssetsToBuild.lua screenshot is as shown in the image below where I have added the dependencies.

bandicam 2015-11-03 17-35-58-712

It took me around 6 hours to finish this assignment.

You can download the Zip File of Assignment_09 from the following link to get the square output. Use the arrow keys to move the red square.

Direct3D_Release_09

After downloading, run Game.exe.

Semester 3 Thesis – 6

Everyone is working hard for IGF now. To give an update on the particle hit effect: through playtesting, the effects seem too realistic for now and do not match the theme of the game.

So I have decided to work with Unity's particle system and script the behavior to generate one from scratch. It will use the meshes or textures provided by the artists. So far I have come up with these effects.

The one in the middle uses the triangle texture and will be used for the arrow hitting the wall. The particles have gravity applied, fall down in a cube shape, and their collision with the obstacle is detected. I am using a texture from the Low Poly Prototyper pack from the Unity Asset Store; later the artists will provide their own textures.

Picture1

The following video shows the particle effect.

So far I am very happy with the results, since it is low poly and cartoonish and goes with the theme of the game. Based on the team's feedback, this effect seems to be finalized.