Capture the Tiles – My Final Assignment :)

Our final assignment required us to use everything we had learnt throughout the semester to develop a game.

I felt that my current thesis game was a good candidate for being reproduced using my game engine, since it involved some of the elements I had learnt in class (like changing tile materials when they are captured by the players).

I decided to make a much simpler version of my thesis game, wherein the players would be at the center of a hex grid and could shoot projectiles to claim tiles. This game would use a lot of the things I had developed for my thesis game, like a hex grid and a line renderer for tracing the projectile's path.

I started off by importing my hex grid into my engine. That required me to increase my actor memory pool by a great extent (my game could handle 100x100 hex grids without slowing down noticeably). I also started messing with the shaders for my tiles, tweaking their alpha values to make them look cool. I assigned an alpha value of 0 to the vertex at the center of my tile mesh, and 1 to the vertices at the edges. This resulted in an interpolated value between 0 and 1 for each of the pixels in between. I used these values to decide whether a pixel lay on a transparent strip or not, and output the appropriate alpha value accordingly. Here's what the tiles look like:

(embedded imgur post)

I also wanted the tiles' brightness to ebb and fade, so I declared a time constant in the fragment shader for my tiles, and set that value from my game. Based on this value, the tile would decide how bright it needed to be.
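Here's a minimal sketch, in C++ rather than the HLSL the real shader uses, of the per-fragment logic described above; the strip count and pulse speed are illustrative values, not the ones I actually shipped.

```cpp
// C++ approximation of the tile fragment shader's alpha/brightness logic.
#include <cmath>

float ComputeTileAlpha(float interpolatedAlpha, float elapsedSeconds)
{
    const float stripCount = 6.0f; // how many transparent/opaque bands per tile (illustrative)
    const float pulseSpeed = 2.0f; // how fast the brightness ebbs and fades (illustrative)

    // interpolatedAlpha runs from 0 at the tile's center vertex to 1 at its edge vertices.
    // Use it to decide whether this fragment falls on a transparent strip.
    float band = std::fmod(interpolatedAlpha * stripCount, 1.0f);
    float stripAlpha = (band < 0.5f) ? 0.2f : 1.0f;

    // The time constant set from the game drives a slow pulse in brightness.
    float brightness = 0.75f + 0.25f * std::sin(elapsedSeconds * pulseSpeed);

    return stripAlpha * brightness;
}
```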

I ended up spending a lot of time on these shaders, as the results were immediately visible, and rewarding.

The next phase of my game involved a line renderer, to show the user where their projectile would land. I was initially going to use DirectX's line primitive to render my trajectory, but once I realized that those lines couldn't be made thicker, I decided to make my own line renderer, with each individual segment made from a quad. I already had the points which would form my line. I took the coordinates of two consecutive points, offset each point in the positive and negative x direction to get four vertices, and used them to form a quad. These quads gave me a line! I experimented a little further, and decided to render only alternate quads, so that the output would be a dotted line.
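A rough sketch of that idea, assuming a simple Vector3 type; the function name and the fixed x offset are stand-ins for my engine's actual code, which also orients the quads properly.

```cpp
// Build a dotted line out of quads from a list of trajectory points.
#include <vector>

struct Vector3 { float x, y, z; };

std::vector<Vector3> BuildDottedLineQuads(const std::vector<Vector3>& points, float halfWidth)
{
    std::vector<Vector3> quadVertices;
    for (size_t i = 0; i + 1 < points.size(); ++i)
    {
        if (i % 2 != 0)
            continue; // render only alternate quads to get a dotted line

        const Vector3& a = points[i];
        const Vector3& b = points[i + 1];

        // Offset both points in the positive and negative x direction
        // to get the four corners of one quad.
        quadVertices.push_back({ a.x - halfWidth, a.y, a.z });
        quadVertices.push_back({ a.x + halfWidth, a.y, a.z });
        quadVertices.push_back({ b.x - halfWidth, b.y, b.z });
        quadVertices.push_back({ b.x + halfWidth, b.y, b.z });
    }
    return quadVertices;
}
```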

I developed a new shader for my line, which would make it more transparent towards the edges and opaque at the center. It would also create a small strip which would travel along the line renderer, looping back when it reached the end of the quad (you can see it in the video below; notice the strip on the line renderer).

My engine already had a collision system, but I had to tweak it to detect collisions in three dimensions (it uses swept collision detection 🙂). I created separate callback functions for my collision system, which would destroy my projectiles on impact and change the tile colors based on the team shooting them.
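The callbacks themselves are simple; here's a hedged sketch of their shape, where Projectile, Tile, and the member names are hypothetical stand-ins for my engine's actual types.

```cpp
// Hypothetical collision callback: destroy the projectile and claim the tile.
struct Projectile { int team; bool markedForDestruction; };
struct Tile { int owningTeam; };

void OnProjectileHitsTile(Projectile& projectile, Tile& tile)
{
    // Destroy the projectile on impact...
    projectile.markedForDestruction = true;
    // ...and recolor/claim the tile for whichever team fired it.
    tile.owningTeam = projectile.team;
}
```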

I used a simple UI layout. The first part consisted of a bar which moved to the left if player 1 was winning, or to the right if player 2 was winning (its color would be determined by the winning team, i.e., it would be red if the red team was winning, and vice versa). The second UI element was a text box which told the user which player was to play next.

(embedded imgur post)

And that’s my game!

EDIT: Added a new shader! I created a fragment shader for when you convert a tile. Here’s what it looks like

The idea had been growing in my head for quite a few days, and I decided to implement it when I had some free time on a train journey home. The strips now move inwards, eventually resulting in a blank tile. Then, new strips (of the color to which the tile has been converted) emanate from the center of the tile and move towards the edges.

You can download it here: Click here

(Unfortunately, it doesn't work if you're using multiple monitors 🙁)

Controls: Use W, S, A, D to aim your projectile. Use I, J, K, L to move the camera around. Use numpad 2, 4, 6, 8 to rotate the camera around (don't forget to turn Num Lock on!). Use M to shoot the projectiles. Good luck, have fun 😀

EAE day!

With EAE day approaching, we needed to get a decent build ready so that we could have people playtest it. Ron had rearchitected some of our code, and our Lights Out build was ready to go. But we wanted to have another idea playtested at EAE day: our Detonate mode. It involved firing a projectile which would travel along the tube. You were in charge of when to trigger it, causing it to explode and permanently get rid of a small portion of the tube. You had to make your opponent fall out of the tube a set number of times to win.

We had to recreate this mode in our rearchitected code, which proved harder than expected. We were in the lab until 1:30 a.m. trying to get it to run. We finally managed to get it ready, and went home for a temporary respite.

The next day, when I arrived at the venue for EAE day, we realized the controls for our builds were really faulty, and I immediately got to work on them along with Skip. We were able to get a build with acceptable controls ready and hand it off to our producers, so that they could load it onto the computers which would be used for demonstrating our game.

EAE day was amazing. A host of people had come to see our games, and I even met an industry pro who had worked on sound for games such as Dota 2, and successful movies like Avatar. Our game was well received by the audience, but it had noticeable camera issues, which I plan to fix ASAP.

That’s it for this semester. I foresee a super busy final semester coming up 🙂

Rendering Sprites!

For this assignment, we had to render sprites (the stuff HUDs and GUIs are made of).

A sprite is an image which is rendered on the screen. It consists of the most basic structure which enables us to render an image on screen: a rectangle. However, unlike other meshes in our game, the sprite will not be affected by any changes in the position or direction of the camera. It will always be the same size, and will render at the same position on the screen. The reason? We specify screen coordinates in order to render sprites. The vertex shader does not perform transformations to bring the vertex positions from model space to world space and so on; they are already in screen space to begin with.

I created a separate vertex shader for my sprites, which simply output the positions without transforming them in any way. The new fragment shader I created for rendering sprites simply sampled the necessary texture, and output the appropriate color.

I created a Sprite class for my engine; here's how it looks:

sprite
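For reference, here's an approximate shape of that class in C++; the member and type names below are illustrative stand-ins, and the real declaration is the one in the screenshot above.

```cpp
// Approximate sketch of the Sprite class described below.
class VertexBuffer; // holds the sprite's quad mesh data (engine type, assumed)
class Material;     // shaders + texture used to draw it (engine type, assumed)

struct Quad
{
    float left, top;      // top-left corner, in normalized screen coordinates
    float width, height;  // rectangle dimensions
};

class Sprite
{
public:
    VertexBuffer* m_vertexBuffer;
    Quad m_quad;
    Material* m_material;

    // Extra bookkeeping for when the sprite acts as a sprite sheet
    unsigned int m_sheetColumns;
    unsigned int m_currentElement;
};
```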

 

It basically had a Vertex Buffer pointer, which holds its mesh data; a Quad, which specified the dimensions of its rectangle (in my case, the top-left corner of the rectangle, its width, and its height); a Material pointer, which contained its shader information; and also additional information in case it acted as a sprite sheet. I used the following sprite sheet in my game:

numbers

This is what my game looks like when the current element pointed to by the sprite sheet is 1.

sheet1

And here's what it looks like when the current element pointed to by the sprite sheet is 4.

sheet4

The Sprite class also shared a NativeResolution object, which told it what resolution its dimensions were specified for. If the resolution changed, I would scale the Quad width and height accordingly, so that the on-screen texture size was unaltered. The top-left corner of the Quad was specified in normalized screen coordinates, and would remain the same across all resolutions.
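A small sketch of that scaling, reusing the Quad from the sketch above and assuming the width/height are stored as fractions of the screen; the math simply keeps the sprite's pixel size fixed.

```cpp
// Rescale a sprite's quad so it covers the same number of pixels at the
// current resolution as it did at the native resolution it was authored for.
struct Resolution { float width, height; };

void ScaleQuadForResolution(Quad& quad, const Resolution& native, const Resolution& current)
{
    // The top-left corner is in normalized coordinates and stays the same.
    quad.width  *= native.width  / current.width;
    quad.height *= native.height / current.height;
}
```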

Here’s what my sprites look like at resolution 1000 X 1000.

1000X1000

and here’s what they look like at 1600 X 800

1600X800

(It may not be evident from these pictures, but their size has remained the same!)

I also changed some of my debug events for PIX, so that they looked more organized.

Here's what the events look like now:

Pixeve

In order to render sprites, the depth test needs to be turned off. Here’s PIX showing the disabled depth test during the draw call

Depth test

We did not use alpha values for our meshes. However, the same is not true for our sprites, and we enable alpha blending before we draw them. Here’s PIX confirming that our blend state is enabled

BlendingTest

 

I also added a simple way to add sprites into the game. I call this the “spriteInformation” file. It contains information about where the sprite appears on the screen, what its dimensions are, and what material it uses. It also has the native resolution for which the sizes were intended. Here's what it looks like:

spriteinfo

 

 

And that’s it for this assignment. Here’s the link to download the executable: Click Here

You can press the plus key on the keypad to move through the sprite sheet!

Camera concerns

With our new direction in mind, we decided to keep the environment simple and geometric. We had to concentrate on the more important issues, like camera and character controls.

I was tasked with improving the camera. Skip had done a pretty good job of having the camera adjust its distance from the player based on whether it was obstructed by a tile. However, it was a little abrupt. I changed the code a little so that the camera would zoom in quickly but zoom out slowly, which kept the motion from looking too jittery.

Assignment 10 – Adding Diffuse Lighting!

Our goal for this assignment was to add diffuse lighting to our game.

Diffuse lighting is a model of lighting in which we assume that all light incident on an object is reflected uniformly in all directions. In other words, an object's face will not be brighter (or darker) if we view it from certain directions. As long as we can view it (i.e., we are not standing behind it), it will appear the same. This model of lighting is much easier to simulate than specular lighting, where the angle at which the light is incident on the object determines where it is reflected.

How do we determine how well lit an object is? Surface normals! Each vertex has a normal associated with it, which tells us which direction is “outward” for the vertex.

normals

 

If we also know the direction of our light, it is fairly easy to calculate the component of that light which will interact with our object, using the dot product (the dot product of two vectors A and B is given by |A||B|cos(θ), where θ is the angle between them).

Hence, if our light source is anti-parallel to our normal, the dot product value (dot(normal, -light_direction)) will be 1, indicating that all of the light will interact with our object.

If they were perpendicular, the dot product would give 0, indicating that the light would not interact with our object. We clamp negative values to 0 (since negative lighting wouldn't make sense).

How do we translate these values into the color of our object's fragments? Up until now, the color of each fragment was determined by the sampled texture color, the color from the vertices, and the color modifier constant. Now, we will add another factor into this calculation. We calculate the dot product of the negative of the light's direction with the fragment's normal, multiply this by the light's intensity, and factor it into the fragment's color. This means that the fragment will be brightest when it is facing the light, and will gradually grow duller as it turns away from it. The intensity of the light also determines the maximum brightness the color can attain. We also take into account some ambient light (stray light arriving at the object after multiple reflections off surrounding objects; these calculations would be too intensive to do accurately, so we just approximate ambient light). Even if an object is not receiving any direct light, the ambient light ensures that it is faintly visible.
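Here's a small sketch of that per-fragment calculation, written in C++ for illustration (in the game it lives in the HLSL fragment shader); the vector type, function names, and the exact way the base color is combined are assumptions, not the shipped code.

```cpp
// Diffuse + ambient lighting for one fragment, assuming 'normal' is unit length.
#include <algorithm>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec3 ComputeLitColor(const Vec3& baseColor, const Vec3& normal,
                     const Vec3& lightDirection, const Vec3& lightColor,
                     const Vec3& ambientColor)
{
    // 1 when the normal is anti-parallel to the light direction, 0 when
    // perpendicular; clamped so faces pointing away never get "negative" light.
    Vec3 toLight = { -lightDirection.x, -lightDirection.y, -lightDirection.z };
    float diffuseAmount = std::max(0.0f, Dot(normal, toLight));

    // Lit color = base (texture * vertex * modifier) color * (diffuse + ambient).
    return { baseColor.x * (lightColor.x * diffuseAmount + ambientColor.x),
             baseColor.y * (lightColor.y * diffuseAmount + ambientColor.y),
             baseColor.z * (lightColor.z * diffuseAmount + ambientColor.z) };
}
```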

On to the actual implementation. We were already importing our meshes from Maya, and had the normal data available to use. I changed my mesh format to take the normal data into account, and also had to change my mesh builder to read in the normals. I also had to change my vertex data to accommodate the additional three floats for the normals. The beauty of loading the mesh data as a binary file at run time is that we don't have to change the way we load it, as it is already in the form we desire.

I added a light source class to my game, which has an intensity and a direction. This would store data for the lights I used in my game. I needed to pass normal data to the fragment shader, via the vertex shader, so that it could compute the lighting. I changed the vertex shader parameters so that they now accepted normals as well.

vertexSh

I transformed the normals from model space to world space in the vertex shader, and passed them to the fragment shader. I had to add additional constants to the fragment shader to account for light color and direction.

fragconsts

The fragment shader would get the interpolated normals from the vertex shader, which would need to be normalized, since they were no longer unit vectors. It would then calculate the dot product and find the component of the light that would interact with the object.

fragLig

 

This would give us the right color for the fragment, taking into account lighting!

Here’s what my fragmentShader debug code looks like in PIX. It tells you how a particular fragment on my red cube got the color it is rendering.

DebugCap

cubeDebug

 

As you can see, the diffuse amount (the output of the dot product) is quite low (0.282), indicating that the light direction is closer to perpendicular to the cube face than anti-parallel to its normal.

I also added a scene file for this assignment, which would contain information about all the actors in my game. Here’s what it looks like

scenefile

I read this file in my game code to populate the world with actors.

This lets you create actors in the game by simply changing a human-readable file. You can decide their initial position, mesh, and material. (I will be adding more functionality as and when I need it, like velocity and collision system status.) Adding meshes, materials, and actors to my game is now easier than ever! Want a new mesh to be added to an actor? Simply add its name to the meshList.txt, change the scene file to say that the actor uses a mesh with that name, and you're all set! (You'll have to add the mesh to the assets folder, and add its name to the AssetsToBuild lua file. Maybe I should have my AssetsToBuild file generate the mesh names from my meshList.lua file?)

I also added a lua file to hold lighting information. It is very basic right now, and contains information regarding the ambient light color, and also the color of the single diffuse light in our game.

Lightinginf

 

 

Another way to make life easier for people who want to tweak values inside my game, but don’t really want to look at my code 🙂

Right now, my diffuse light is located above my objects, pointing straight down. You can use the left Control and left Shift keys to move it along the x axis (left Control = -ve x axis, left Shift = +ve x axis), and the right Control and right Shift keys to move it along the z axis (right Control = -ve z axis, right Shift = +ve z axis). The light will adjust itself so that it always points towards the origin.
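The "always point at the origin" behavior boils down to normalizing the vector from the light's (conceptual) position to (0,0,0); a minimal sketch, with a stand-in Vec3 type:

```cpp
// Direction from the light's position toward the world origin, normalized.
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 DirectionTowardsOrigin(const Vec3& lightPosition)
{
    Vec3 direction = { -lightPosition.x, -lightPosition.y, -lightPosition.z };
    float length = std::sqrt(direction.x * direction.x +
                             direction.y * direction.y +
                             direction.z * direction.z);
    if (length > 0.0f)
    {
        direction.x /= length;
        direction.y /= length;
        direction.z /= length;
    }
    return direction;
}
```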

That’s it for this assignment. Here’s a zip file containing my game: Click Here

 

Assignment 9- Building a mesh exporter for maya!

Up until now, we had been entering the position, color, and UV values for our meshes by hand. This approach is not feasible for more complex meshes, say, a torus or a sphere, which have way more vertices. We address that issue with this assignment.

We created a plug-in for Maya (a 3D modelling package), which would take a mesh you create in it and export it in the human-readable format we designed for our game. We could then read in this format and render the mesh in our game. Exporting from Maya also gave us access to a host of other vertex data (like normals and tangents), but I only export the data which my game uses for this assignment (namely position, color, texture coordinates, and triangle indices).

Maya uses a right-handed coordinate system, as opposed to the left-handed one used by DirectX. Hence, to take that into account, I had to invert the z position values, and also reverse the triangle index order, so that the data conformed to what DirectX expected.
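A minimal sketch of that fix-up, assuming a simplified Vertex struct standing in for my exporter's actual vertex format:

```cpp
// Convert exported mesh data from Maya's right-handed space to Direct3D's
// left-handed space: negate z, then reverse each triangle's winding order.
#include <cstdint>
#include <utility>
#include <vector>

struct Vertex { float x, y, z; /* color, UVs, ... */ };

void ConvertToLeftHanded(std::vector<Vertex>& vertices, std::vector<uint32_t>& indices)
{
    for (Vertex& v : vertices)
        v.z = -v.z;

    // Flipping an axis mirrors the mesh, so the winding order of each triangle
    // must also be reversed to keep faces pointing outward.
    for (size_t i = 0; i + 2 < indices.size(); i += 3)
        std::swap(indices[i + 1], indices[i + 2]);
}
```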

Here's what my pyramid mesh looks like (it's a lot less complex/compact than my cylinder mesh, which has a ton more vertices):

 

PyramidMesh


 

I’m currently rendering 6 meshes in my game.

- A floor mesh (floorMesh.txt)

- A pyramid mesh (pyramidMesh.txt)

- 2 cylinder meshes (cylinderMesh.txt)

- 2 cube meshes (cubeMesh.txt)

I added controllers to the game, which determine how my actors behave.

I also added a world system from my previous assignments, which keeps track of all the actors in the game (so that I can just loop through them in the render function and draw them!). Here's how I assign meshes and materials to my actors now, and add them to the world.

GameCPP

 

Another addition for this assignment was player settings. We now have a settings.ini file in the game directory, which ships with the game, and enables the user to change the width and height of the window, and also to toggle fullscreen mode on or off.

SettingsFile

 

As long as the user enters a valid resolution, the game window will scale accordingly. However, if the dimensions are incorrect, the game will default to 800 width and 600 height.

I load the settings file in as a lua table, and go through the values independently to use them in the game.

Here’s how I read the values

LoadingPIC

(Thanks to Vinod Madigeri for helping me with the IsThisWidthAndHeightSupported function, which validates the user's height and width combination.)
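For illustration, here's a hedged sketch of reading such a settings file with the Lua C API, assuming the file is Lua that sets global width, height, and fullscreen values; my actual loader is the one shown in the screenshot above and also validates the resolution.

```cpp
// Load window settings from a Lua-formatted settings file, with defaults.
extern "C"
{
#include <lua.h>
#include <lauxlib.h>
#include <lualib.h>
}

bool LoadSettings(const char* path, int& width, int& height, bool& fullscreen)
{
    // Defaults used if the file is missing or the values are invalid.
    width = 800; height = 600; fullscreen = false;

    lua_State* L = luaL_newstate();
    luaL_openlibs(L);
    if (luaL_dofile(L, path) != 0) { lua_close(L); return false; }

    lua_getglobal(L, "width");
    if (lua_isnumber(L, -1)) width = static_cast<int>(lua_tointeger(L, -1));
    lua_getglobal(L, "height");
    if (lua_isnumber(L, -1)) height = static_cast<int>(lua_tointeger(L, -1));
    lua_getglobal(L, "fullscreen");
    if (lua_isboolean(L, -1)) fullscreen = lua_toboolean(L, -1) != 0;

    lua_pop(L, 3);
    lua_close(L);
    return true;
}
```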

I also added custom events for PIX (so that I can group events like setting materials, etc.).

 

PixPic

Here’s what my game looks like now!

GameAss9

Here’s a link to download the zip file for my executable: Click Here

 

Lights Out!

I prototyped my idea with the help of a hex grid last weekend. It was really interesting to study the coordinate systems you could use with a hex grid, as I had to figure out logic which would know if two tiles were connected on the hex grid.

I ended up storing two lists, one per player, with information about all the tiles that player currently controlled. Whenever a player captured a new tile, I would check for connections with their existing tiles, and if I found one, I would grant all the connecting tiles to that player. I would also remove these tiles from the other player's list. The hex grid system looked neat, and the mechanic seemed fun to me 🙂
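A very rough sketch of that bookkeeping; TileId and the TilesBetween helper (which would walk the hex grid between two connected tiles and is declared but not defined here) are placeholders for the prototype's real coordinate logic.

```cpp
// Capture a tile, grant any tiles connected through it, and take them away
// from the opposing player's list.
#include <algorithm>
#include <vector>

typedef int TileId;
std::vector<TileId> TilesBetween(TileId a, TileId b); // empty if a and b aren't connected (defined elsewhere)

void CaptureTile(TileId captured, std::vector<TileId>& myTiles, std::vector<TileId>& theirTiles)
{
    // Gather the new tile plus everything granted by a connection with a tile I already own.
    std::vector<TileId> granted;
    granted.push_back(captured);
    for (size_t i = 0; i < myTiles.size(); ++i)
    {
        std::vector<TileId> between = TilesBetween(myTiles[i], captured);
        granted.insert(granted.end(), between.begin(), between.end());
    }

    for (size_t i = 0; i < granted.size(); ++i)
    {
        TileId tile = granted[i];
        if (std::find(myTiles.begin(), myTiles.end(), tile) == myTiles.end())
            myTiles.push_back(tile);
        // The other player loses any tile I just claimed.
        theirTiles.erase(std::remove(theirTiles.begin(), theirTiles.end(), tile), theirTiles.end());
    }
}
```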

We met in class on Tuesday and showed off our prototypes. Everyone had done a terrific job, and though people liked my idea, we all came to the conclusion that Skip's “Lights Out” idea was simple and effective, and should be the way we move forward.

The idea was to give the users the ability to turn their tiles on and off. This gave the painting a lot more meaning.

This meant that we would be scrapping a LOT of our current game. Most of the environment art, the weapons system, and the UI would have to be thrown away. To say that I was scared would be an understatement. We were almost starting afresh, but I believe it was a necessary step in order to make our game fun.

 

Assignment 8 – Adding Textures and UVs!

Till now, we had been assigning colors to our vertices individually, and letting the interpolator handle the calculation for determining the color of each fragment in between. If we wanted two particular objects which used the same Material (i.e., the same vertex and fragment shaders) to look different, we could only do so by changing the color modifier constant in the fragment shader for that particular object. But this only allowed us to change the overall color of the object by multiplying the color of each fragment by a constant. We could not, for example, draw a smiley face on the face of the cube we'd been rendering.

That's where textures come into play! A texture is basically a picture (a 2-dimensional array of colors). You apply this picture to the mesh you've rendered. But the mesh needs to know which vertex on it corresponds to which part of the image. This is done with the help of UV maps!

UV maps help to map a 2D texture onto a 3D object. If you imagine one corner of the texture placed at the origin of the U-V (or X-Y) plane, with the other corners of the texture at (1,0), (0,1), and (1,1), each point on the texture now has a U and a V coordinate between 0 and 1. Now, if we assign a U-V value to each of our mesh vertices, we can map these vertices to points on the texture, and that's how we figure out the color a particular mesh vertex needs to have.
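A tiny illustration of the idea, with a stand-in vertex struct rather than my engine's actual vertex format: a quad whose four vertices map to the four corners of a texture, so the whole image is stretched across its surface.

```cpp
// Four vertices of a unit quad, each paired with a UV coordinate in [0,1].
struct TexturedVertex
{
    float x, y, z;   // position
    float u, v;      // texture coordinates
};

const TexturedVertex quad[4] =
{
    { 0.0f, 0.0f, 0.0f,  0.0f, 0.0f },  // maps to the texture's (0,0) corner
    { 1.0f, 0.0f, 0.0f,  1.0f, 0.0f },  // (1,0)
    { 0.0f, 1.0f, 0.0f,  0.0f, 1.0f },  // (0,1)
    { 1.0f, 1.0f, 0.0f,  1.0f, 1.0f },  // (1,1)
};
```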

TriangleUV

UvRepresent

 

 

Our goal for this assignment was to add textures to our meshes. The first step was to create a new builder tool, the “TextureBuilder”, which in my case would take .png files and output .dds files (DirectDraw Surface, a file format used by DirectX) into the data folder for the game.

The fragment shader is responsible for “sampling” a texture at a given U-V coordinate and returning the color at that location. The U-V coordinates are interpolated between the vertices, just like the vertex colors, so each fragment samples the texture at its own interpolated coordinate. This U-V data needs to be passed to the fragment shader by the vertex shader, and to the vertex shader by us. Hence, I had to change my vertex format, and also my vertex and fragment shaders.

Here’s my vertex shader parameters list right now:

VertexParam

Here’s my fragment shader parameters list right now:
FragmentDecl

 

And here’s what my vertex declaration has changed to:

VertexDec

Our meshes now needed to store UV data as well. I altered my mesh file format to look as follows (this is my new floorMesh.txt):

FloorUVs

 

I also had to change the way I parsed my human-readable mesh file and wrote it in binary to my data folder, and make small changes to my read code, so that I could offset the position from where I read my indices after reading the mesh file at run time.

I now have a texture associated with each Material, and I’ve incorporated a path to the texture in the Material files. Here’s an example

NewMatfile

My Material class now holds a pointer to the texture it uses, and also the index of the sampler used by the fragment shader (basically, the data I would need to set the texture before a draw call). When you set a material now, it loads the texture in addition to the shaders.

Here’s part of my set material function, called before I render an object.

LoadMatfunc

 

 

And that’s it for this assignment. Doesn’t it look pretty!

 

Gamepic

 

Here’s what my SetTexture() call looks like in PIX:

 

 

 

Pixsettex

 

Link to download my current Assignment: Click Here

Controls are the same, arrow keys for moving the cube, keypad 2,4,6,8 to move the camera. Make sure the numpad is locked!

Assignment 7- Adding a Mesh file format and a Mesh Builder

We had been using hard coded vertex and index values for our meshes, but that is hardly feasible in a real game. Ideally, we would want our vertices and indices to be generated by a modeling software such as Maya, and then load the exported Mesh values into our game.

For this assignment, we had to create a human readable mesh file format which we would parse using a new Mesh builder, convert it into a binary file, and finally load the binary at runtime to use in our game. Human readability was of utmost priority, and with that in mind, I decided to use the following file format. Here’s what my Cube and Floor Mesh files look like:

CubeMeshFile

 

FloorMeshFile

 

I did not wish to clutter my data with letters, such as

x = -5.0, y= -1.0 ,z = -1.0,

as I felt that they hindered readability. The data seems self-explanatory without the letter aids. Each mesh file would include the number of vertices in the mesh, the number of indices, the positions of the vertices, and any other data, which in my case happened to be the vertex colors. Each table in the indices section corresponds to a triangle.

The use of arrays for the lua data also proved easier to parse. Here’s what the core of my MeshBuilder code looks like.

Meshbuildernew

This would convert my human-readable mesh data into a compact binary file which I could use in my game.

meshnew2

 

I keep track of the number of values on the stack, so that if anything fails, I can simply pop that amount and go to the OnExit label.

I used a single read for the file (which reads the entire file into an appropriately sized buffer), and a single write as well, which copies the binary data into the data folder of my game. I purposely tried to keep these reads/writes to a minimum, as they can bottleneck performance.

Here's what the memory I copy the mesh data into looks like:

MemPic

Here's how I read the data in my game (my Mesh class does it, and creates the vertex and index buffers in the process).

Reading the data:

ReadingMeshdata

Once I have the mesh data read in, I can simply use memcpy (which copies a chunk of bytes) to copy the vertex info into the vertex buffer! I can do the same for the index buffer (thus saving me the trouble of iterating through the data and filling the buffers element by element).
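As a hedged sketch of that "lock, memcpy, unlock" fill using the Direct3D 9 vertex buffer interface (the function name, parameters, and vertex stride handling here are illustrative; my actual code is in the screenshots below):

```cpp
// Copy a contiguous chunk of vertex data into a D3D9 vertex buffer in one memcpy.
#include <d3d9.h>
#include <cstring>

bool FillVertexBuffer(IDirect3DVertexBuffer9* vertexBuffer,
                      const void* vertexData, unsigned int vertexCount, unsigned int vertexStride)
{
    void* destination = nullptr;
    const unsigned int sizeInBytes = vertexCount * vertexStride;

    if (FAILED(vertexBuffer->Lock(0, sizeInBytes, &destination, 0)))
        return false;

    // One memcpy moves the whole block of vertex data read from the binary file.
    std::memcpy(destination, vertexData, sizeInBytes);

    return SUCCEEDED(vertexBuffer->Unlock());
}
```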

Filling the vertex buffer

Fill vertex buffer

Filling the index buffer

Fillingindexbuf

I also altered my Actor class to now hold a Mesh* and a Material* (the structures are finally falling into place!). The Mesh class holds information regarding the vertex and index buffers, while the Material class holds information regarding the vertex and pixel shaders.

So now, when I initialize my game, it first goes through a list of materials and meshes (both lua files), and loads them into maps. Whenever I create a new actor, I just have to assign it a material and a mesh from these maps, and I should be good to go! This also ensures that meshes aren't duplicated across objects using the same one, and changes to a mesh are reflected on all objects using it.
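A sketch of that lookup-by-name idea; the map names, the Actor fields, and the AssignAssets helper are hypothetical stand-ins for my engine's actual code (shown in the screenshots below).

```cpp
// Shared Mesh/Material instances, keyed by asset name, assigned to actors by lookup.
#include <map>
#include <string>

class Mesh;      // created from the binary mesh files
class Material;  // created from the material files

std::map<std::string, Mesh*> meshMap;
std::map<std::string, Material*> materialMap;

struct Actor
{
    Mesh* mesh;
    Material* material;
};

// Hypothetical helper: point a new actor at the shared assets it names.
bool AssignAssets(Actor& actor, const std::string& meshName, const std::string& materialName)
{
    std::map<std::string, Mesh*>::iterator meshIt = meshMap.find(meshName);
    std::map<std::string, Material*>::iterator matIt = materialMap.find(materialName);
    if (meshIt == meshMap.end() || matIt == materialMap.end())
        return false;
    actor.mesh = meshIt->second;
    actor.material = matIt->second;
    return true;
}
```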

Here’s my initialize function:

InitializeFunc

 

How I initialize my MeshMap (similar functionality for the MaterialMap):

 

MeshMap

 

 

I had to change my render calls so that they used the vertex and index buffers from the mesh objects in my actors (not too difficult).

That's it! Here's the zip file for this Assignment: Click Here

The controls to move the cube are the same, arrow keys for the cube, and numpad 2,4,6,8 to move the camera. Make sure your numpad is locked!

New Directions?

During our stand-up meeting in class post-IGF, we started talking about how our game seemed really busy, with a lot of things happening on screen. Also, our painting mechanic did not have any strategy associated with it, and that was making our game dull.

We discussed for a long while how we could incorporate mechanics which would make the game better. It took us the best part of two classes to shortlist a few ideas that we would like to prototype.

I had proposed a mechanic in which, if the user paints two tiles which are connected horizontally, vertically, or diagonally, they also get all the tiles in between. Around four other ideas were pitched as well. We decided to let the people who had pitched the ideas prototype them, and have them ready by the following week.

I really love prototyping new ideas, and I'm looking forward to prototyping mine this weekend.