The Final Game

This was our final assignment, where we had to make a game using the engine we developed over the course. I made a Frogger-style game where the player is a cute little cube that has to jump over obstacles to cross the road. There are three types of obstacles in the game: cars, trains, and spikes. It is an endless game, and the game ends if the player collides with any of the obstacles.

The engine we developed is essentially a graphics rendering engine that supports both OpenGL and Direct3D. It contains various builders, such as a Material Builder, Mesh Builder, Texture Builder, Shader Builder, and other helper builders, to streamline the asset import process. These builders convert all the assets to a binary format, which helps reduce the game's size. The engine can render opaque and transparent objects and can easily be extended to create different graphics effects.

We also built a mesh exporter for Maya, which allowed us to export model data such as vertex, index, RGB color, and UV information to a Lua file. This Lua mesh file is then converted to a binary format at build time and loaded by the game at run time.

Here is a screenshot of the game:

[Screenshot of the game]

I made the game endless by repositioning the world objects once they went behind the camera. I added basic AABB collision between the player and the obstacles. The floor is a simple plane whose texture contains the road, the pavement, and the railway tracks. The cube's jump uses Verlet integration. After colliding with an obstacle, a Game Over mesh is drawn in front of the camera. Here is how it looks:

[Screenshot: the Game Over screen]

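For reference, here is a minimal sketch of the kind of AABB overlap test described above; the struct layout and names are illustrative, not the actual game code.

#include <cmath>

// A hypothetical axis-aligned bounding box: a world-space center
// plus half-extents along each axis
struct AABB
{
	float centerX, centerY, centerZ;
	float halfX, halfY, halfZ;
};

// Two AABBs overlap only if their intervals overlap on every axis
bool Intersects( const AABB& a, const AABB& b )
{
	return std::abs( a.centerX - b.centerX ) <= ( a.halfX + b.halfX )
		&& std::abs( a.centerY - b.centerY ) <= ( a.halfY + b.halfY )
		&& std::abs( a.centerZ - b.centerZ ) <= ( a.halfZ + b.halfZ );
}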

During the development of this game I came to understand how important the pipeline we developed is, and how valuable it would be to include more tools like it. I could easily have added builders for each GameObject to define its position, rotation, collision data, etc., but since it was a small game and I did not have much time, I left them out. I am thinking of adding them as a stretch goal, because if I were a third person using this engine, I would want an easier way to change an object's position and properties without diving into the code.

Over this one semester I learnt a lot about good and bad programming practices, such as code segregation, namespaces, platform independence, avoiding unnecessary for loops, setting the right project dependencies, and a lot of nitty-gritty things that I wasn't aware of earlier. I once spent almost three hours figuring out a problem, only to find out that I had hard-coded the vertex data.

I was eager to learn how objects are rendered, and now I know. I expected to learn a little more about lighting and different shaders, but because of time restrictions we could not. My coding style has improved, most importantly the way I architect code. Earlier I used to write code for myself, but now I have realised that you essentially have to write code for someone else, so that they find it easier to edit it or add something to it. Architecting code requires a lot of thought and a good knowledge of design patterns, which I have been able to grasp over the semester. One has to think more and code less, whereas what I used to do earlier was the opposite.

This course has piqued my interest in graphics, so I have chosen another computer graphics course for the next semester. I am really looking forward to it. This has been the best semester so far.

Link to the Game (D3D and OpenGL). Use the arrow keys to move the player.

The EAE Day

EAE Day is the day when all the games developed under EAE, including all the cohort and capstone projects, are showcased. All this time we had been testing our game on the lab machines, where it worked fine, but it started giving us issues when we installed it at the kiosk. It took a while, but we got our game running within an hour.

A lot of people came to our booth and liked our idea a lot. They appreciated our concept and even asked us when we were going on Steam! We received a huge amount of feedback, most of which covered things we had already discussed and discarded. It seems we will have to revisit those design decisions.

Since the semester has ended and almost everyone is going on vacation, we plan to have a design meeting over Skype so that we don't waste time after coming back to school. It was fun working on this project; I hope we are able to release the game successfully next year!! 🙂

Materialize

This was our final assignment, where we were required to add a material to an object, which allows us to add textures to it and combine all the effect- and shader-related data in one file called the material file. This material file contains the path to the effect, the uniforms that need to be set for the transparent and opaque shaders, the texture path, and the texture uniform name.

Here is the new material file, which includes the texture data:

[Hex dump of the binary material file, with the new texture data highlighted in orange]

I have added a null terminator after each uniform name and then appended my texture data after the names: first the texture count, i.e. the number of textures, and then the actual texture path. I added the texture data at the end so that I don't need to change my existing extraction of the effect path and uniform information. The orange highlighted part is the new texture data.

I added the texture data to the existing material class, since the data was only a small struct and I did not see the need to separate it from the material.
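Here is a hedged sketch of how that appended block can be read, following the layout described above; the identifiers are mine, and I am assuming a one-byte texture count.

#include <cstdint>

// i_afterUniformNames points just past the final uniform name's null
// terminator in the loaded binary material file
void ReadTextureBlock( const char* i_afterUniformNames )
{
	// The texture count comes first (assumed here to be a single byte)...
	const uint8_t textureCount = *reinterpret_cast<const uint8_t*>( i_afterUniformNames );
	// ...followed by the null-terminated texture path
	const char* const texturePath = i_afterUniformNames + sizeof( textureCount );
	// (load textureCount texture(s) starting from texturePath here)
}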

Here is the new OpenGL scene:

[Screenshot: the OpenGL scene]

Below is the D3D scene:

[Screenshot: the D3D scene]

If you look closely, the Direct3D scene is sharper but more jagged, while the OpenGL scene is smoother. This is because of the difference in sampling between the two platforms.

Here is the Link to the Game. Use WASD to move the camera and the arrow keys to move the helix object.

Materials

This was a somewhat long assignment, since I felt the requirements were not very clear and I had to figure out a couple of things, but it was challenging and fun. We were required to create materials for our objects. Previously we assigned different effects to the objects. The reason we now use materials instead of effects is so that we can pass a custom color and a custom alpha to each object. A material will also contain textures in the future, so we could create different materials, say wood, glass, or paper, depending on the textures.

Following is my transparent material Lua file:

[Transparent material Lua file]

This material contains the effect path and the uniform data. Each uniform entry contains the uniform name, the uniform value, and the shader type. The uniform name is required for setting the uniform's value in the shader, and the value is what our code will set. The shader type tells us which shader the uniform belongs to.

Below is the binary version of the above material file.

[Hex dump of the binary material file]

The red highlighting shows the path of the effect. The green highlights are the null terminators I placed to separate the data. The orange represents the count of the uniform data. The violet represents the uniform structs, which contain the actual data such as the uniform value count and the values. Here is the structure of the uniform data:

[Uniform data struct definition]

The tUniformHandle is null at this point, but since it is a const char* type of handle it takes up 8 bytes (4 bytes for OpenGL), the ShaderTypes enum takes up 4 bytes, the 4 float values take 16 bytes, and valueCountToSet takes 1 byte; the remaining bytes are padding for memory alignment before the null terminator.
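A sketch of what such a struct might look like follows; the identifiers here are illustrative placeholders, not necessarily the engine's actual names.

#include <cstdint>

enum class eShaderType : uint32_t { Vertex, Fragment };	// 4 bytes

struct sUniformData
{
	const char* uniformHandle;	// 8 bytes on x64 D3D (a D3DXHANDLE is a const char*); a 4-byte GLint on OpenGL
	eShaderType shaderType;		// 4 bytes
	float values[4];		// 4 floats = 16 bytes
	uint8_t valueCountToSet;	// 1 byte; the compiler pads the rest for alignment
};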

The blue represents the uniform names, which will be used to create the handles. These names are separated by null terminators.

The binary material file is the same for the Debug and Release builds, but it differs between the OpenGL and D3D versions because of the difference in handle type: D3D uses a D3DXHANDLE and OpenGL uses a GLint.

Following is the result of our assignment:

[Screenshot: four spheres and a helix]

The image contains four spheres. From the left, the first two use the opaque material with different colors, red and green. The next two use the transparent material, with alpha values of 0.2 and 0.7 respectively. The helix uses another opaque material, called the default material, whose uniform color value is white, so that we get the colors I set in Maya without any color overlay.

Link to the Game. Use WASD to move the camera and the arrow keys to move the helix.

Maya Exporter

This assignment involved creating a Maya mesh exporter that exports mesh data in the human-readable Lua format, similar to what we had made earlier by hand. The second task was to add transparency to the objects.

The best part about our Maya mesh builder is that it does not depend on any other project, nor does any project depend on it. It is an independent tool that can be built and used on its own.

Below is an image of Maya's plugin manager, where we can see our custom plugin loaded with the names “eae6320_bhukan_rohan.mll” (Release) and “eae6320_bhukan_rohan_DEBUG.mll” (Debug).

[Screenshot: Maya's plugin manager]

Though the Maya exporter is an independent tool that builds a Maya plugin DLL, we can debug the plugin from Visual Studio by attaching the debugger to the Maya process. Below is an image showing the plugin being debugged.

[Screenshot: debugging the plugin in Visual Studio]

Below is our beautiful scene. The transparent objects have alpha blending, depth testing, and face culling enabled and depth writing disabled. The opaque objects, the ball and the helix, have alpha blending disabled and depth testing, depth writing, and face culling enabled.

[Screenshot: the scene with transparent and opaque objects]

Following are the different render states contained in our effect file. I chose to put all the render states inside one table so that they are all in one place. I also added face culling to the render states, which will allow me to show both faces of a polygon if needed.

[Effect Lua file with the render states table]

We store all the render states in a single uint8_t variable, an 8-bit data type, using bit shifting. Currently we have hard-coded the bit positions of the render states: AlphaEnabled is at position 0, DepthTest at 1, DepthWrite at 2, and FaceCulling at 3. In the future I am thinking of adding enums to remove the hard-coding. This means that if the binary value of the render state is 1100, then alpha is turned off and depth testing is turned off, but depth writing and face culling are enabled.

We achieve this by performing the following operations:

renderStateVariable |= 1 << 0 (enabling alpha)

renderStateVariable &= ~(1 << 0) (disabling alpha)

We use the | (OR) operator to set a state's bit, and the & (AND) operator with the complemented mask to clear it; ORing with 0 would leave the bit unchanged.
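Here is a minimal sketch of these helpers using the bit positions above; the names are illustrative, not the engine's actual identifiers.

#include <cstdint>

namespace RenderStates
{
	const uint8_t Alpha       = 1 << 0;
	const uint8_t DepthTest   = 1 << 1;
	const uint8_t DepthWrite  = 1 << 2;
	const uint8_t FaceCulling = 1 << 3;
}

inline void EnableState( uint8_t& io_renderState, const uint8_t i_state )
{
	io_renderState |= i_state;	// set the bit
}

inline void DisableState( uint8_t& io_renderState, const uint8_t i_state )
{
	io_renderState &= static_cast<uint8_t>( ~i_state );	// clear the bit
}

inline bool IsStateEnabled( const uint8_t i_renderState, const uint8_t i_state )
{
	return ( i_renderState & i_state ) != 0;
}

// Example: the opaque states (binary 1110): depth test, depth write,
// and face culling on, alpha off
// const uint8_t opaque = RenderStates::DepthTest | RenderStates::DepthWrite | RenderStates::FaceCulling;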

Following is the binary version of the opaque effect:

And our transparent effect:

In the above two files, the highlighted values are the render state values. The values of two states, alpha and depth writing, differ because we don't want transparency for the opaque objects. Depth writing makes sure the opaque objects are drawn based on the depth mask; if it is disabled, the object is drawn using the painter's algorithm. We use the painter's algorithm when we want to draw objects on top of one another without caring about their depth values, so that an object drawn behind can still be seen through a transparent one. We placed the render state value at the end of the file so that I can extract it without changing the existing extraction.

It was fun working in Maya and learning how to apply alpha to objects. Most of all, I liked the Maya mesh exporter and the pipeline that we have built so far.

UPDATE:

I updated the code by defining the render states so that we do not need to hard-code them. This is how I defined my render states:

[Render state definitions]

Link to the Game. Use WASD to move the camera and the arrow keys to move the helix.

Third Dimension

So far we had been working in two dimensions, with a boring square and triangle on the screen. In this assignment we created 3D objects and added a camera that can move around the scene.

Here is what our 3D scene looks like:

[Screenshot: the 3D scene]

We achieved this 3D look by adding one more coordinate, the Z coordinate, to our existing system and by using three transformation matrices. Each matrix performs a specific transformation:

1. Local to World: converts the local coordinates of the object to the game's world coordinates.

Let me explain the difference between local and world coordinates. Imagine yourself as a 3D object: your local origin is at the center of your body (somewhere near the hips), but your world coordinate is your actual physical address in the world. Therefore, to draw an object, the object's local position has to be transformed into world coordinates.

2. World to View: converts the world coordinates of the object to the camera's coordinates.

You now know the concept of world coordinates. The camera's coordinate system is defined by the position of the camera in the world. In our “game”, the box's world position is at the center of the world and the camera is 10 units away from the box. To see an object, it has to be seen through an eye, which in this case is the camera; hence the transformation from world to view (camera) coordinates.

3. View to Screen: converts the camera's coordinates to screen coordinates.

Screen coordinates are the projection of everything seen through the camera onto the screen, which is essentially a 2D image.
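Putting the three together, a vertex travels local → world → view → screen. Here is a minimal sketch of that chain, assuming column vectors; Matrix4 and Vector4 are hypothetical math types, not the engine's actual classes.

Vector4 LocalToScreen( const Vector4& i_localPosition,
	const Matrix4& i_localToWorld,
	const Matrix4& i_worldToView,
	const Matrix4& i_viewToScreen )
{
	const Vector4 worldPosition  = i_localToWorld * i_localPosition;	// model transform
	const Vector4 viewPosition   = i_worldToView * worldPosition;		// camera transform
	const Vector4 screenPosition = i_viewToScreen * viewPosition;		// projection
	return screenPosition;
}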

I also added rotation to the cube.

[Screenshot: the cube intersecting the floor]

The above image shows how the box is able to intersect the floor. If we do not enable the depth buffer, the painter's algorithm is used to draw objects, meaning the object that was drawn first goes behind the object that was drawn next, and so on. With a depth buffer, each pixel's depth is stored in the buffer. The depth of the pixel that is about to be drawn is compared with the value already in the buffer, and the pixel with the lower value is the one drawn. We clear the buffer to 1 since 1 is the maximum value in the buffer, corresponding to the far plane, while 0 is at the near plane. We use the less-than comparison because we want objects with lower depth values to be drawn nearer to the camera.
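On the OpenGL side, this setup amounts to something like the following sketch; the D3D path has equivalent calls, and this is not the engine's exact code.

#include <GL/gl.h>

void EnableDepthBuffering()
{
	glEnable( GL_DEPTH_TEST );	// compare incoming depths against the buffer
	glDepthFunc( GL_LESS );		// draw a pixel only if its depth is lower (nearer)
	glDepthMask( GL_TRUE );		// allow new depth values to be written
	glClearDepth( 1.0 );		// clear to the far plane's value
	glClear( GL_DEPTH_BUFFER_BIT );
}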

Here is the new human-readable Lua mesh file for the floor that our cube rests on:

[Floor mesh Lua file]

As you can see, I have just added z as the third value in the position table. Each vertex position is now 3 × 4 = 12 bytes.


For the camera, I created a camera structure that stores all the camera-specific values, such as its position, rotation, FOV, near plane, far plane, and aspect ratio. I felt that the camera is part of the Graphics project, hence I added it inside the Graphics namespace.
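A sketch of such a struct might look like this; the field names and types are illustrative, not my exact declarations.

namespace Graphics
{
	struct sCamera
	{
		float position[3];	// world-space position
		float rotation[3];	// orientation, e.g. Euler angles
		float fieldOfView;	// vertical FOV
		float nearPlane;	// near clip distance
		float farPlane;		// far clip distance
		float aspectRatio;	// viewport width / height
	};
}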

This was an interesting assignment, in which I learnt the concept of the depth buffer and how objects are drawn in 3D.

Here is the link to the Game. Move the camera using WASD and move the cube using the arrow keys.

One Render for All

If you have been following the blog for a while, you might have noticed that we started the project by writing platform-specific code, and as we move forward we are trying to make it more platform independent. In this assignment we had to create a single Render function for D3D and OpenGL. Another requirement was to make our shaders platform independent, or at least partly so.

New Render function:

[The shared Render function]

This is our new Render function, which is common to both platforms. To achieve this I created a GraphicsContext structure that encapsulates the platform-specific code behind preprocessor directives. The Clear function contains the platform-specific code for clearing the frame before drawing a new image. The RenderCommandStart and RenderCommandEnd functions are specific to D3D, so their OpenGL directives are empty. DisplayRenderBuffer contains the platform-specific code to display the buffers on the screen.

The OpenGL platform requires the s_deviceContext to draw the buffers, hence I created a separate GetContext method under the Graphics namespace which returns the s_deviceContext handle.
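In outline, the split looks something like this sketch; the macro and function names are illustrative, and the platform bodies are reduced to comments.

void Clear()
{
#if defined( PLATFORM_D3D )
	// clear the back and depth buffers through the D3D device
#elif defined( PLATFORM_OPENGL )
	// glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
#endif
}

void Render()
{
	Clear();			// clear the frame to draw a new image
	RenderCommandStart();		// BeginScene() on D3D; does nothing on OpenGL
	// ... submit draw calls for every renderable ...
	RenderCommandEnd();		// EndScene() on D3D; does nothing on OpenGL
	DisplayRenderBuffer();		// Present() on D3D / SwapBuffers() on OpenGL
}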

Following is the platform-independent vertex shader code. In it I used the HLSL variable types to declare my variables, for no specific reason other than familiarity: I have edited some Unity shaders, which are written in HLSL, so the choice was mostly instinctive.

[Platform-independent vertex shader]

To handle the shaders' dependency on shaders.inc, I created a separate dependencies table in my Lua asset file for each asset type. This table contains the list of all the dependencies an asset may have. I could simply have written a single key-value pair for the dependency, but I made it future-proof so that more than one dependency can be added.

Here is my new asset table:

[Asset table with the dependencies list]

It took me a while to get the dependency working, since I kept thinking of other ways I could add the dependency to my asset file, but I think this is a pretty solid way of solving the problem. Also, debugging Lua is a little irritating.

Link to the Game

Binary Shaders

In this assignment we had to design a Lua effect file which contains the paths to the shaders after the game is built. Using these effect paths, we can create any number of effects and just add paths to the effect file, without needing to pass the path of every shader file in code while compiling. We also had to add shader builder tools which compile the shaders for us based on the platform, that is, D3D or OpenGL.

Here is the effect file that I designed:

[Effect Lua file]

In it, I gave the shaders separate paths, with the type of shader as the key and the path to the shader as the value. I included “data” in the path since that is how the property sheets in my project are set up, and I did not want to mess around with them.

Below is the binary version of the effect file.

[Hex dump of the binary effect file]

This binary file has a null character at the end of each path. This makes the file easier to parse, since to find the offset of the next path we simply need the length of the previous path plus one extra character for the null terminator.

I extracted the paths of the shaders in the following way. I wanted to avoid using an STL function to get the length of the string, hence the “for” loop. Extracting the vertex path is straightforward, since the file reader reads the file up to the null terminator. To get the fragment path, we need the length of the previous path, the position of the buffer, and one extra character to skip the null terminator that follows the first path.

[Code: extracting the shader paths]
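A hedged reconstruction of that extraction looks roughly like this; the variable names are mine, and i_buffer is assumed to hold the loaded file, laid out as "<vertexPath>\0<fragmentPath>\0".

#include <cstddef>

void ExtractShaderPaths( const void* i_buffer )
{
	// The vertex path starts at the beginning of the buffer
	const char* const vertexPath = reinterpret_cast<const char*>( i_buffer );

	// Measure the first path by hand instead of calling strlen()
	size_t vertexPathLength = 0;
	for ( const char* c = vertexPath; *c != '\0'; ++c )
	{
		++vertexPathLength;
	}

	// The fragment path begins after the first path and its null terminator
	const char* const fragmentPath = vertexPath + vertexPathLength + 1;
	// (hand both paths to the shader loading code here)
}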

I had to create separate shader builders for the vertex and fragment shaders, since that was the simplest solution; otherwise I would have had to revamp my Lua file and change the way I was parsing it, which would have taken a lot of time that I wanted to avoid. The overhead of creating the two shader builders was not much, hence I went with this option. I had almost gone halfway down another route, creating an Include.h file which contained the enums for the type of shader and the DEBUG preprocessor definition, but I changed my mind.

We use a separate #define instead of checking _DEBUG directly, so that we can build our shaders in a different configuration; this makes it easier for us to debug the shaders even in Release mode.
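The idea is a dedicated switch that normally follows _DEBUG but can be flipped independently; a sketch, with an illustrative macro name:

// Shaders are built with debug information whenever this is defined.
// It defaults to following _DEBUG...
#ifdef _DEBUG
	#define ARE_DEBUG_SHADERS_ENABLED
#endif
// ...but it can also be defined manually to debug shaders in a Release build.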

Following is the comparison of  compiled shader in Debug and Release mode.

DEBUG compiled D3D vertex shader.

[Hex dump: the Debug D3D vertex shader]

RELEASE compiled D3D vertex shader.

[Hex dump: the Release D3D vertex shader]

As you can see, the Debug shader is much larger than the Release shader, because D3D stores symbols and other data used during debugging, whereas it does not when building for Release. This helps make the Release build of our game much smaller.

Below are screenshots of the OpenGL platform's Debug and Release vertex shaders. The Debug file is slightly larger than the Release one because the Debug build keeps the comments for reference, while the Release build strips them.

DEBUG compiled OpenGL vertex shader

[The Debug OpenGL vertex shader]

RELEASE compiled OpenGL vertex shader

[The Release OpenGL vertex shader]

Overall this was a fun assignment. I had small conundrums along the way, such as deciding whether to create separate builders, whether to use the STL to find string lengths, whether to use null terminators for the file paths, etc., but I enjoyed solving these problems.

Link to the Game


Post IGF

We have finally submitted Blind Trust to IGF!! This is a huge moment for us and our biggest milestone. We tested our game with a couple of employees at Rockwell, who were essentially our seniors. They gave us a lot of feedback, and watching them struggle while playing the game was heart-wrenching. We saw the mistakes we had been making, one of the biggest being that we tested the game within the team but not outside it.

They told us that the ping we had added for communication was almost useless: it got irritating for the deaf player to ping continuously, and since pinging was an essential component of the mechanic, it did not make sense to dedicate a button to playing the ping. They also told us that many art assets did not match the overall low-poly art style. We will be discussing these things with the team and talking about what needs to be tweaked.

Move Around

I had almost anticipated that I would have to do a lot of work in this assignment due to my laziness in the previous one. In this assignment we had to move the mesh by altering offsets in the vertex shader. Most of the time was spent figuring out how to render multiple meshes and how to alter their positions from the game.

I created a Renderable struct and a list containing these Renderables. Using this list, I was able to render the meshes easily. The Renderable struct wraps the mesh, the effect, and the position offset. To access this list I had to use the extern keyword, since the list is part of the Graphics project and I had to reach it from the Game project. Figuring this out took time.
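A sketch of that arrangement, with illustrative names (Mesh and Effect standing in for the engine's actual classes):

#include <vector>

namespace Graphics
{
	class Mesh;	// stand-ins for the engine's actual classes
	class Effect;

	struct sRenderable
	{
		Mesh* mesh;		// the geometry to draw
		Effect* effect;		// the shaders and render states to draw it with
		float offset[2];	// position offset fed to the vertex shader uniform
	};

	// Defined in the Graphics project; the Game project accesses it via extern
	extern std::vector<sRenderable> s_renderables;
}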

For future assignments I might have to encapsulate the Renderable in a GameObject class, since that is what we did in Joe's class and it seems to be the right approach.

The square in the center can be controlled using the LEFT, RIGHT, UP, and DOWN arrow keys.

[Screenshot: the movable square]

We move our meshes using uniforms in the vertex shader. The same effect could be achieved by altering the vertex data, but that would be costly, since we would be rebuilding the mesh every frame. Altering a uniform is the cheapest way to move, or rather offset, a mesh's position.
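On OpenGL, setting such an offset uniform looks roughly like this; the uniform name "g_position_offset" is an assumption for illustration, the GL headers and function pointers are assumed to be available, and the D3D path would use SetVertexShaderConstantF instead.

void SetPositionOffset( const GLuint i_programId, const float i_x, const float i_y )
{
	// Look the uniform up by name (in real code the handle would be cached,
	// not queried every frame)
	const GLint offsetHandle = glGetUniformLocation( i_programId, "g_position_offset" );

	// Push the current offset without touching the vertex buffer
	const float offset[2] = { i_x, i_y };
	glUniform2fv( offsetHandle, 1, offset );
}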

Link to the Game.