Category Archives: GameEngineer 2 Class

Capture the Tiles – My Final Assignment :)

Our final assignment required us to apply everything we had learnt throughout the semester and develop a game.

I felt that my current thesis game was a good candidate for being reproduced using my game engine, since it involved some of the elements I had learnt in class (like changing tile materials when they are captured by the players).

I decided to make a much simpler version of my thesis game, wherein the players would be at the center of a hex grid, and could shoot projectiles to claim tiles. This game would use a lot of the things I had developed for my thesis game, like a hex grid and a line renderer for tracing the projectile's path.

I started off by importing my hex grid into my engine. That required me to increase my actor memory pool substantially (my game could handle 100x100 hex grids without slowing down noticeably). I also started experimenting with the shaders for my tiles, tweaking their alpha values to make them look cool. I assigned an alpha value of 0 to the vertex at the center of my tile mesh, and 1 to the vertices at the edges. This resulted in an interpolated value between 0 and 1 for each of the pixels between them. I used these values to decide whether a pixel lay on a transparent strip or not, and output the appropriate alpha value accordingly. Here's what the tiles look like:

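The strip test can be sketched on the CPU like this; the function name, band count, and the half-and-half split are my illustrative assumptions, not the shader's actual code, but the idea is the same: the interpolated 0-to-1 value picks alternating transparent and opaque bands.

```cpp
#include <cmath>

// 't' is the value interpolated from 0 at the tile's center vertex to 1 at
// its edge vertices. Bands alternate between transparent and opaque.
float StripAlpha(float t, int stripCount)
{
    float band = std::fmod(t * stripCount, 1.0f); // position within one band pair
    return (band < 0.5f) ? 0.0f : 1.0f;           // first half transparent, second opaque
}
```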

I also wanted the tiles' brightness to ebb and fade, so I declared a time constant in the fragment shader for my tiles, and set that value from my game. Based on this value, the tile would decide how bright it needed to be.
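A minimal sketch of that ebb-and-fade, assuming a sine mapping and a 0.5-to-1.0 brightness range (both are my assumptions for illustration; only the time constant comes from the post):

```cpp
#include <cmath>

// The game sets a time value each frame; the shader maps it to a brightness
// factor that oscillates smoothly between 0.5 and 1.0.
float TileBrightness(float timeSeconds)
{
    return 0.75f + 0.25f * std::sin(timeSeconds);
}
```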

I ended up spending a lot of time on these shaders, as the results were immediately visible, and rewarding.

The next phase of my game involved a line renderer, to show the user where their projectile would land. I was initially going to use DirectX's line primitive to render my trajectory, but once I realized that the lines couldn't be made thicker, I decided to make my own line renderer, with each individual strip made from quads. I already had the points that would form my line. I took the coordinates of two consecutive points, offset each point in the positive and negative x directions to get four vertices, and used them to form a quad. These quads gave me a line! I experimented a little further, and decided to render alternate quads, so that the output would be a dotted line.

I developed a new shader for my line, which would make it more transparent towards the edges, and opaque at the center. It would also create a small strip which would travel along the line renderer, looping back when it reached the end of the quad ( you can see it in the video below, notice the strip on the line renderer).

My engine already had a collision system, but I had to tweak it to detect collisions in three dimensions ( It uses swept collision detection 🙂 ). I created separate callback functions for my collision system, which would destroy my projectiles on impact, and change the tile colors based on the team shooting them.

I used a simple UI layout. The first part consisted of a bar which moved to the left if player 1 was winning, or to the right if player 2 was winning (its color was determined by the winning team, i.e., it would be red if the red team was winning, and so on). The second UI element was a text box which told the user which player was to play next.


And that’s my game!

EDIT: Added a new shader! I created a fragment shader for when you convert a tile. Here’s what it looks like

The idea had been growing in my head for quite a few days, and I decided to implement it when I had some free time on a train journey home. The strips now move inwards, eventually resulting in a blank tile. Then, new strips ( of the color to which the tile has been converted) emanate from the center of the tile and move towards the edges.

You can download it here: Click here

(Unfortunately, it doesn’t work if you’re working with multiple monitors 🙁 )

Controls: Use w,s,a,d to aim your projectile. Use i,j,k,l to move the camera around. Use numpad 2,4,6,8 to rotate the camera around ( don’t forget to lock your numpad!). Use m to shoot the projectiles. Good luck, have fun 😀

Rendering Sprites!

For this assignment, we had to render sprites (the stuff HUDs and GUIs are made of).

A sprite is an image which is rendered on the screen. It consists of the most basic structure which enables us to render an image on screen: a rectangle. However, unlike other meshes in our game, the sprite will not be affected by any changes in the position or direction of the camera. It will always be the same size, and will render at the same position on the screen. The reason? We specify screen coordinates in order to render sprites. The vertex shader does not perform transformations to bring the vertex positions from model space to world space and so on. They are already in screen space to begin with.
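Concretely, the only conversion a sprite position needs is from pixels into the normalized range the hardware expects. This is a sketch of one common convention (names are mine, and depending on the vertex declaration an engine may instead pass pretransformed pixel coordinates directly):

```cpp
// Convert a pixel position to a normalized screen position with x,y in
// [-1, 1] and y pointing up. No model/world/view/projection transforms.
void PixelToScreen(float px, float py, float screenW, float screenH,
                   float& outX, float& outY)
{
    outX = (px / screenW) * 2.0f - 1.0f;
    outY = 1.0f - (py / screenH) * 2.0f;
}
```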

I created a separate vertex shader for my sprites, which simply output the positions without transforming them in any way. The new fragment shader I created for rendering sprites simply sampled the necessary texture, and output the appropriate color.

I created a Sprite class for my engine, here’s how it looks



It basically holds a Vertex Buffer pointer, which contains its mesh data; a Quad, which specifies the dimensions of its rectangle (in my case, the top-left corner of the rectangle, its width, and its height); a Material pointer, which contains its shader information; and additional information in case it acts as a sprite sheet. I used the following sprite sheet in my game:
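A minimal sketch of a Sprite class along those lines; every member name here is assumed, and the sheet lookup assumes a single-row sprite sheet:

```cpp
struct Quad                 // top-left corner plus dimensions
{
    float left, top, width, height;
};

struct Material;            // shaders + texture, defined elsewhere in the engine

class Sprite
{
public:
    void* m_vertexBuffer = nullptr;  // IDirect3DVertexBuffer9* in the real engine
    Material* m_material = nullptr;  // shader information
    Quad m_dimensions{};             // where and how big on screen
    int m_elementCount = 1;          // sprite-sheet info: elements in the sheet
    int m_currentElement = 0;        //   ...and which one is currently shown

    // U range of the current element, assuming one row of equal-width elements
    float ElementU0() const { return (float)m_currentElement / m_elementCount; }
    float ElementU1() const { return (float)(m_currentElement + 1) / m_elementCount; }
};
```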


This is what my game looks like when the current element pointed to by the sprite sheet is 1.


And here's what it looks like when the current element pointed to by the sprite sheet is 4.


The sprite class also shared a NativeResolution object, which informed it of the resolution its dimensions were specified for. If the resolution changed, I would scale the Quad width and height accordingly, so that the on-screen texture size was unaltered. The top-left corner of the Quad was specified in normalized screen coordinates, and would remain the same across all resolutions.
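The rescaling amounts to keeping the sprite's pixel size constant; a sketch under that assumption (the function name is mine):

```cpp
// Width/height are authored in normalized units for a native resolution.
// Rescale them so the sprite covers the same number of pixels at the
// current resolution. The normalized top-left corner needs no adjustment.
float RescaleExtent(float normalizedExtent, float nativeSize, float currentSize)
{
    float pixels = normalizedExtent * nativeSize; // intended size in pixels
    return pixels / currentSize;                  // normalized at the new resolution
}
```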

Here’s what my sprites look like at resolution 1000 X 1000.


and here’s what they look like at 1600 X 800


( It may not be evident from these pictures, but their size has remained the same!)

I also changed some of my debug events for PIX, so that they looked more organized.

Here's what the events look like now


In order to render sprites, the depth test needs to be turned off. Here’s PIX showing the disabled depth test during the draw call

Depth test

We did not use alpha values for our meshes. However, the same is not true for our sprites, and we enable alpha blending before we draw them. Here’s PIX confirming that our blend state is enabled



I also added a simple way to add sprites to the game. I call this the “spriteInformation” file. It contains information about where the sprite appears on the screen, what its dimensions are, and what material it uses. It also has the native resolution for which the sizes were intended. Here's what it looks like




And that’s it for this assignment. Here’s the link to download the executable: Click Here

You can press the plus key on the keypad to move through the sprite sheet!

Assignment 10 – Adding Diffuse Lighting!

Our goal for this assignment was to add diffuse lighting to our game.

Diffuse lighting is a model of lighting in which we assume that all light incident on an object is reflected uniformly in all directions. In other words, an object's face will not be brighter (or darker) if we view it from certain directions. As long as we can see it (i.e., we are not standing behind it), it will appear the same. This model of lighting is much easier to simulate than specular lighting, where the angle at which the light hits the object determines where it is reflected.

How do we determine how well lit an object is? Surface normals! Each vertex has a normal associated with it, which tells us which direction is “outward” for that vertex.


If we know the direction of our light as well, it is fairly easy to calculate the component of that light which will interact with our object, using the dot product (the dot product of two vectors A and B is given by |A||B|cos(theta), where theta is the angle between them).

Hence, if our light source is anti-parallel to our normal, the dot product value (dot(normal, -light_direction)) will be 1, indicating that all of the light will interact with our object.

If they were perpendicular, the dot product would give 0, indicating that the light would not interact with our object. We clamp negative values to 0 ( since negative lighting wouldn’t make sense).

How do we translate these values to the color of our object's fragments? Up until now, the color of each fragment was determined by the sampled texture color, the color from the vertices, and the color modifier constant. Now, we add another factor into this calculation. We calculate the dot product of the negative of the light's direction with the fragment's normal, multiply this by the light's intensity, and factor it into the fragment's color. This means that the fragment will be brightest when it is facing the light, and will gradually grow duller as it turns away from it. The intensity of the light also determines the maximum brightness the color can attain. We also take into account some ambient light (stray light arriving at the object after multiple reflections from surrounding objects; these calculations would be too intensive to do accurately, so we just approximate ambient light). Even if an object is not receiving any direct light, the ambient light ensures that it is faintly visible.
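The core of that calculation can be sketched on the CPU as follows; the function names are assumptions, and the normal and light direction are assumed to already be unit vectors:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// dot(normal, -light_direction), clamped so there is no "negative light".
// The final fragment color would then be roughly:
//   textureColor * (diffuseAmount * lightIntensity + ambient)
float DiffuseAmount(const Vec3& normal, const Vec3& lightDirection)
{
    Vec3 toLight = { -lightDirection.x, -lightDirection.y, -lightDirection.z };
    return std::max(0.0f, Dot(normal, toLight));
}
```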

Onto the actual implementation. We were already importing our meshes from Maya, and had the normal data available to use. I changed my mesh format to include the normal data, and also had to change my mesh builder to read in the normals. I also had to change my vertex data to accommodate the additional three floats for normals. The beauty of loading the mesh data as a binary file at run time is that we don't have to change the way we load it, as it is already in the form we desire.

I added a light source class to my game, which has an intensity and a direction. This would store data for the lights I used in my game. I needed to pass normal data to the fragment shader, via the vertex shader, so that it could compute the lighting. I changed the vertex shader parameters so that they now accepted normals as well.


I transformed the normals from model space to world space in the vertex shader, and passed them to the fragment shader. I had to add additional constants to the fragment shader to account for light color and direction.

The fragment shader would get the interpolated normals from the vertex shader, which would need to be normalized, since they were no longer unit vectors. It would then calculate the dot product and find the component of the light that would interact with the object.



This would give us the right color for the fragment, taking into account lighting!

Here’s what my fragmentShader debug code looks like in PIX. It tells you how a particular fragment on my red cube got the color it is rendering.




As you can see, the diffuse amount (the output of the dot product) is quite low (0.282), indicating that the light is closer to perpendicular to the cube face than anti-parallel to its normal.

I also added a scene file for this assignment, which would contain information about all the actors in my game. Here’s what it looks like


I read this file in my game code to populate the world with actors.

This lets you create actors in the game by simply changing a human-readable file. You can decide an actor's initial position, mesh, and material. (I will be adding more functionality as and when I need it, like velocity and collision system status.) Adding meshes, materials, and actors to my game is now easier than ever! Want a new mesh to be added to an actor? Simply add its name to the meshList.txt, change the scene file to say that the actor uses a mesh with that name, and you're all set! (You'll have to add the mesh to the assets folder, and add its name to the AssetsToBuild Lua file. Maybe I should have my AssetsToBuild file generate the mesh names from my meshList.lua file?)

I also added a Lua file to hold lighting information. It is very basic right now, and contains information regarding the ambient light color, and also the color of the single diffuse light in our game.



Another way to make life easier for people who want to tweak values inside my game, but don’t really want to look at my code 🙂

Right now, my diffuse light is located above my objects, pointing straight down. You can use the left control and left shift keys to move it along the x axis ( left control = -ve x axis, left shift = +ve x axis), and right control and right shift keys to move it along the z axis ( right control = -ve z axis, right shift = +ve z axis). The light will adjust itself in such a way that it will always point towards the origin.

That’s it for this assignment. Here’s a zip file containing my game: Click Here


Assignment 9- Building a mesh exporter for maya!

Up until now, we had been entering the position, color, and UV values for our meshes by hand. This approach is not feasible for more complex meshes, say, a torus or a sphere, which have far more vertices. We address that issue with this assignment.

We created a plug-in for Maya (a 3D modelling software), which takes a mesh you create in it and exports it in the human-readable format we designed for our game. We could then read in this format and render the mesh in our game. Exporting from Maya also gave us access to a host of other vertex data (like normals and tangents), but for this assignment I only export the data which my game uses (namely, position, color, texture coordinates, and triangle indices).

Maya uses a right-handed coordinate system, as opposed to the left-handed one used by DirectX. To take that into account, I had to invert the z-position values, and also the triangle index order, so that the data conformed to what DirectX expected.

Here's what my pyramid mesh looks like (it's a lot less complex than my cylinder mesh, which has a ton more vertices)




I’m currently rendering 6 meshes in my game.

-A floor mesh ( floorMesh.txt)

-A pyramid mesh( pyramidMesh.txt)

-2 cylinder meshes ( cylinderMesh.txt)

-2 cube meshes (cubeMesh.txt)

I added controllers to the game, which determine how my actors behave.

I also added a world system from my previous assignments, which keeps track of all the actors in the game ( so that I can just loop through them in the render function and draw them!). Here’s how I assign meshes and materials to my actors now, and add them to the world.



Another addition for this assignment was player settings. We now have a settings.ini file in the game directory, which ships with the game, and enables the user to change the width and height of the window, and also to toggle fullscreen mode on or off.



As long as the user enters a valid resolution, the game window will scale accordingly. However, if the dimensions are incorrect, the game will default to 800 width and 600 height.
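The fallback behaviour can be sketched like so; the simple positivity check here is a stand-in for the real IsThisWidthAndHeightSupported validation, and the names are assumptions:

```cpp
// Use the user's resolution when it passes validation, otherwise fall back
// to the 800 x 600 default described above.
void ResolveResolution(int requestedW, int requestedH, int& outW, int& outH)
{
    const bool valid = requestedW > 0 && requestedH > 0; // simplified validity test
    outW = valid ? requestedW : 800;
    outH = valid ? requestedH : 600;
}
```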

I load the settings file in as a Lua table, and go through the values independently to use them in the game.

Here’s how I read the values


(Thanks to Vinod Madigeri for helping me with the IsThisWidthAndHeightSupported function, which validates the user's height and width combination.)

I also added custom events for PIX (so that I can group events like setting materials, etc.)



Here’s what my game looks like now!


Here’s a link to download the zip file for my executable: Click Here


Assignment 8 – Adding Textures and UVs!

Till now, we had been assigning colors to our vertices individually, and letting the interpolator handle the calculation that determines the color of each fragment in between. If we wanted two particular objects which used the same Material (i.e., the same vertex and fragment shaders) to look different, we could only do so by changing the color modifier constant in the fragment shader for each object. But this only allowed us to change the overall color of the object by multiplying the color of each fragment by a constant. We could not, for example, draw a smiley face on the face of the cube we'd been rendering.

That’s where textures come into play! A texture is basically a picture ( a 2-dimensional array of colors). You apply this picture to the mesh you’ve rendered. But the mesh needs to know which vertex on it corresponds to which part of the image. This is done with the help of UV maps!

UV maps help map a 2D texture onto a 3D object. If you imagine one corner of the texture placed at the origin of the U-V (or X-Y) plane, with the other corners of the texture at (1,0), (0,1) and (1,1), each point on the texture now has a U and a V coordinate between 0 and 1. Now, if we assign a U-V value to each of our mesh vertices, we can map these vertices to points on the texture, and that's how we figure out the color a particular mesh vertex needs to have.
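In code, "sampling" a texture at a (u,v) pair amounts to picking a texel out of a 2D array of colors. A nearest-neighbour sketch (real hardware also filters between texels; names are mine):

```cpp
#include <algorithm>

// Index into a row-major W x H texel array for a (u,v) pair in [0,1].
int TexelIndex(float u, float v, int width, int height)
{
    int x = std::min(width - 1, (int)(u * width));
    int y = std::min(height - 1, (int)(v * height));
    return y * width + x;
}
```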





Our goal for this assignment was to add textures to our meshes. The first step was to create a new builder tool, the “TextureBuilder”, which in my case would take .png files and output .dds files (DirectDraw Surface, a file format used by DirectX) into the data folder for the game.

The fragment shader is responsible for “sampling” a texture at a given U-V coordinate and returning the color at that location. The U-V coordinates are interpolated between the vertices, just like the vertex colors. This U-V data needs to be passed to the fragment shader by the vertex shader, and to the vertex shader by us. Hence, I had to change my vertex format, and also my vertex and fragment shaders.

Here’s my vertex shader parameters list right now:


Here’s my fragment shader parameters list right now:


And here’s what my vertex declaration has changed to:


Our meshes now needed to store UV data as well. I altered my mesh file format to look as follows ( this is my new floorMesh.txt):



I also had to change the way I parsed my human-readable mesh file and wrote it in binary to my data folder. I also had to make small changes to my read code, so that I could offset the position from which I read my indices after reading the mesh file at run-time.

I now have a texture associated with each Material, and I’ve incorporated a path to the texture in the Material files. Here’s an example


My Material class now holds a pointer to the texture it uses, and also the index of the sampler used by the fragment shader ( basically, the data I would need to set the texture before a draw call).  When you set a material now, it loads the texture in addition to the shaders.

Here’s part of my set material function, called before I render an object.




And that’s it for this assignment. Doesn’t it look pretty!




Here’s what my SetTexture() call looks like in PIX:






Link to download my current Assignment: Click Here

Controls are the same, arrow keys for moving the cube, keypad 2,4,6,8 to move the camera. Make sure the numpad is locked!

Assignment 7- Adding a Mesh file format and a Mesh Builder

We had been using hard coded vertex and index values for our meshes, but that is hardly feasible in a real game. Ideally, we would want our vertices and indices to be generated by a modeling software such as Maya, and then load the exported Mesh values into our game.

For this assignment, we had to create a human-readable mesh file format, which we would parse using a new Mesh builder, convert into a binary file, and finally load at runtime to use in our game. Human readability was of utmost priority, and with that in mind I decided to use the following file format. Here's what my Cube and Floor Mesh files look like:





I did not wish to clutter my data with letters, such as

x = -5.0, y = -1.0, z = -1.0,

as I felt that they hindered readability. The data seems self explanatory without the letter aids. Each mesh file would include the number of vertices in the mesh, the number of indices, the position of the vertices, and any other data, which in my case happened to be the vertex colors. Each table in the indices section corresponds to a triangle.

Using arrays for the Lua data also made it easier to parse. Here's what the core of my MeshBuilder code looks like.


This would convert my human-readable mesh data into a compact binary file which I could use in my game.



I keep track of the number of values on the stack, so that if anything fails, I can simply pop that amount and go to the OnExit label.

I used a single read for the file (which reads the entire file into an appropriately sized buffer), and a single write as well, which copies the binary data into the data folder of my game. I purposely tried to keep these reads/writes to a minimum, as they can bottleneck performance.

Here's what the memory I copy the mesh data into looks like:

Here’s how I read the data in my game ( my Mesh class does it, and creates the vertex and index buffers in the process)

Reading the data:


Once I have the mesh data read in, I can simply use memcpy (which copies a chunk of bytes) to copy the vertex info into the vertex buffer! I can follow the same procedure for the index buffer (thus saving me the trouble of iterating through the data and filling the buffer element by element).
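The single-read-then-memcpy pattern can be sketched like this; the exact binary layout (vertex count, index count, vertices, indices) and the struct/function names are my assumptions of a typical format, not necessarily this engine's:

```cpp
#include <cstring>
#include <cstdint>
#include <vector>

struct sVertex { float x, y, z; std::uint32_t color; };

// Parse an already-loaded binary mesh buffer with one memcpy per section,
// instead of looping over individual vertices and indices.
void ParseMeshBuffer(const std::vector<std::uint8_t>& file,
                     std::vector<sVertex>& outVertices,
                     std::vector<std::uint32_t>& outIndices)
{
    const std::uint8_t* p = file.data();
    std::uint32_t vertexCount, indexCount;
    std::memcpy(&vertexCount, p, sizeof vertexCount); p += sizeof vertexCount;
    std::memcpy(&indexCount,  p, sizeof indexCount);  p += sizeof indexCount;

    outVertices.resize(vertexCount);
    std::memcpy(outVertices.data(), p, vertexCount * sizeof(sVertex));
    p += vertexCount * sizeof(sVertex);

    outIndices.resize(indexCount);
    std::memcpy(outIndices.data(), p, indexCount * sizeof(std::uint32_t));
}
```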

Filling the vertex buffer

Fill vertex buffer

Filling the index buffer


I also altered my Actor class to hold a Mesh* and a Material* (the structures are finally falling into place!). The Mesh class holds information regarding the vertex and index buffers, while the Material class holds information regarding the vertex and pixel shaders.

So now, when I initialize my game, it first goes through a list of materials and meshes (both Lua files), and loads them into maps. Whenever I create a new actor, I just assign it a material and a mesh from these maps, and I'm good to go! This also ensures that meshes aren't duplicated for objects using the same ones, and changes to a mesh are reflected on all objects using it.
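The sharing behaviour falls out of the map lookup; a sketch, with names assumed (and with the Mesh left as a stub, since the real one owns D3D buffers):

```cpp
#include <map>
#include <string>

struct Mesh { /* vertex and index buffers in the real engine */ };

std::map<std::string, Mesh*> g_meshMap;

// Actors ask for meshes by name, so two actors using "cubeMesh" get the
// same Mesh object rather than a duplicate. (Never freed in this sketch.)
Mesh* GetMesh(const std::string& name)
{
    auto it = g_meshMap.find(name);
    if (it == g_meshMap.end())
        it = g_meshMap.insert({ name, new Mesh }).first; // load once
    return it->second;
}
```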

Here’s my initialize function:



How I initialize my MeshMap. ( Similar functionality for MaterialMap)





I had to change my render calls so that they used the vertex and index buffers from the mesh object in my actors ( not too difficult).

That's it! Here's the zip file for this assignment: Click Here

The controls to move the cube are the same, arrow keys for the cube, and numpad 2,4,6,8 to move the camera. Make sure your numpad is locked!


The objective of our sixth assignment was to change our asset build process from accepting the names of all assets to be built on the command line to accepting a file which includes all the information regarding the assets to be built. We also created a new builder for our shader files, so that the program does not have to compile them at run-time, and can just load the compiled files.

The shader builder expects an additional argument on the command line when being invoked, which tells it whether the shader being compiled is a “vertex” shader or a “fragment” shader. It compiles the shader into the game directory, from where the compiled file is loaded at run-time.

For the asset build list, I decided to use the file format suggested by our professor, since I felt that it did a good job of maintaining readability and handled all the information that would be necessary for building each asset. Here’s how my file looks:


The file returned a table, with each table value containing information specific to an asset builder. This information included the name of the asset builder, the list of assets which would be built using that particular asset builder, and the source and target extensions of the assets to be built. Provision for extra arguments to be given on the command line was also made.

I had to make minor changes to my source file extensions, since I was earlier using my GenericBuilder to copy files of varying source extensions into the game directory, but the source extension for all assets built with the same builder had to be identical for my asset list to work. Hence, I changed the source extension of all my assets which used the GenericBuilder to “.txt”.

Here’s a zip file of my Assignment: Click Here


Assignment 5

We finally got into 3D rendering with this assignment. This involved extending our vertex information to hold an additional float value for the Z position.



Here is an example of the vertex layout of a cube mesh. Each of these meshes holds its vertex data in model space, which is the location of the vertices relative to a pivot.

The next step in getting this mesh rendered onto the screen is to get it into world coordinates. Each of the vertices will be transformed by a “Model space to World space” matrix, which converts the vertex positions to positions in the world.


Our mesh is now in world coordinates. In order to get it onto the screen, we need to know where the camera (or eye) is placed, and which direction it is facing. We use this information to calculate the “View space to world space” matrix. The inverse of this matrix, when applied to each of the meshes in our game, transforms the vertex positions of our mesh such that they are now located in a world where the camera is at the origin, facing the positive z direction (for DirectX).
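A deliberately simplified sketch of those two steps using translation only (the real pipeline uses full 4x4 matrices, which also handle rotation and projection; names are mine):

```cpp
struct Vec3 { float x, y, z; };

// Model space -> world space: offset by the model's position in the world.
Vec3 ModelToWorld(const Vec3& v, const Vec3& modelPosition)
{
    return { v.x + modelPosition.x, v.y + modelPosition.y, v.z + modelPosition.z };
}

// World space -> view space: the inverse of the camera's own placement,
// so the camera ends up at the origin.
Vec3 WorldToView(const Vec3& v, const Vec3& cameraPosition)
{
    return { v.x - cameraPosition.x, v.y - cameraPosition.y, v.z - cameraPosition.z };
}
```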

This makes it easy for the graphics hardware to compute which of the vertices will be visible on the screen, and which of them are located closest to the screen (so that we do not render fragments unnecessarily, since they would be drawn over by other meshes closer to the screen).


At the end of these transformations, we can finally render our meshes onto the screen!

Here’s a screenshot of my program rendering two meshes: A cube, and a floor


Here’s a screenshot of Pix showing debug information of my vertex shader.




You can move the cube and the camera around in the game. The cube is controlled by the arrow keys, and the camera is controlled by the Numpad keys 8,6,4 and 2 ( Make sure the numlock is turned on)

You can download a zip of my program here: Click Here

[EDIT]: Restructuring had messed up my error handling. Here’s a better build which does not crash when it doesn’t find files it needs to run( like shaders, materials etc.): Click Here

[EDIT]: I’ve added functionality from my previous programming classes into my game now. So I can implement shared pointers, memory pools, store data as actors, have a world which keeps track of all the Actors etc.! I plan to use them for my upcoming assignments 🙂


The purpose of the fourth assignment was to start dabbling in per-material and per-instance shader constants ( values which could differ between two meshes using the same material) and to create a more refined asset build system.

I added a constants table to my Material files, which would have more tables containing constants segregated by data type. For this assignment, this involved adding a single table for Float3 values:


VertexShader = "data/vertexShader.hlsl",
FragmentShader = "data/fragmentShader.hlsl",

Constants =
{
    g_colorModifier = { 1.0, 0.5, 1.0 },
},

Here’s how the rectangle looks with the above value of g_colorModifier


Changing g_colorModifier to { 0.0, 0.5, 1.0 } gives the following result:


This was an example of a per-material constant. Different meshes could use the same material with slight variations in its constant values.

I changed the architecture of my Material class as well; it now maintains a std::map for each of the vertex and fragment shaders, which associates each constant in the shader with a handle that can be used to change its value at run time. This map is populated automatically at the start of the program, based on the values in the “Constants” table in the Material file.

The other constant was a per-instance value, g_meshPosition_screen, which controlled the position at which the rectangle was rendered on screen. It would be set each time we rendered our meshes, based on the position of the mesh, which in turn would be updated every frame depending on user input. I created a Mesh class which stored a pointer to its associated Material, along with a Vector3 to keep track of its position.

The arrow keys can be used to change the position of the rectangle on the screen.

We also added a GenericBuilder class to our project, so that we can start moving towards special build processes for different asset types, instead of building all assets in the same way. This builder simply copies the assets to our game directory at the moment.

Here’s a zip file of my assignment: ClickHere.



For our 3rd assignment, we had to render a rectangle, and also create our own Material file format.

We moved on to IndexedPrimitives for this assignment: instead of explicitly specifying each vertex to be rendered in order, we filled the vertex buffer with the vertex information and specified triplets of indices into that buffer, each of which gave us the three vertices of a triangle to draw.

Assignment 3-Saumya



The Material file would be an easy-to-read/edit file which defines the parameters of a particular material to be applied to a mesh in our game. For this assignment, the Material file would only contain information about which vertex and fragment shaders to use to render the particular object. Here's how I set up my material file.





VertexShader= “data\vertexShader.hlsl”,
FragmentShader= “data\fragmentShader.hlsl”,




This file specified the location of the vertex and fragment shaders associated with my default Material. I chose to put the shaders in a separate table in order to keep the file clean and readable ( since I expect the file to grow significantly in the future).

My Graphics class now has a std::map of Materials, which associates a string with a Material pointer. This map is populated at the start of the program, by reading through a “MaterialList.lua” file which holds the names of all the materials used in the game.

This map helps me look up the path of the .mat file associated with a particular Mesh during rendering and get the pointer to the Material object, which enables me to set the vertex and fragment shaders accordingly!

I'm currently creating and destroying a new luaState for each Lua file I need to read/execute. I plan on changing this in the future.

Here’s a zip file to run my current build: Click Here