Monthly Archives: December 2014

Capture the Tiles – My Final Assignment :)

Our final assignment required us to apply the techniques we had learnt throughout the semester and develop a game.

I felt that my current thesis game was a good candidate for being reproduced using my game engine, since it involved some of the elements I had learnt in class (like changing tile materials when they are captured by the players).

I decided to make a much simpler version of my thesis game, wherein the players would be at the center of a hex grid and could shoot projectiles to claim tiles. This game would reuse a lot of the things I had developed for my thesis game, like a hex grid and a line renderer for tracing the projectile's path.

I started off by importing my hex grid into my engine. That required me to increase my actor memory pool considerably (my game could handle 100×100 hex grids without slowing down noticeably). I also started messing with the shaders for my tiles, tweaking their alpha values to make them look cool. I assigned an alpha value of 0 to the vertex at the center of my tile mesh, and 1 to the vertices at the edges. This resulted in an interpolated value between 0 and 1 for each of the pixels between them. I used these values to decide whether a pixel lay on a transparent strip or not, and output the appropriate alpha value accordingly. Here's what the tiles look like:
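The strip test can be sketched in plain C++ (the real version lives in the tile's fragment shader); `StripAlpha`, `stripCount`, and `stripWidth` are illustrative names, not the engine's actual ones:

```cpp
#include <cmath>

// Sketch of the per-pixel strip test. 'interpolatedAlpha' is 0 at the tile's
// center vertex and 1 at the edge vertices; the rasterizer interpolates it
// for every pixel in between.
float StripAlpha(float interpolatedAlpha, int stripCount, float stripWidth)
{
    // Position within the current strip, in [0, 1).
    float posInStrip = std::fmod(interpolatedAlpha * stripCount, 1.0f);
    // Pixels inside the strip band become transparent; the rest stay opaque.
    return (posInStrip < stripWidth) ? 0.0f : 1.0f;
}
```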


I also wanted the tiles' brightness to ebb and fade, so I declared a time constant in the fragment shader for my tiles and set its value from my game. Based on this value, the tile decides how bright it needs to be.
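A minimal sketch of that ebb-and-fade calculation, written here as plain C++ for illustration (the actual version is shader code, and all the names here are assumptions):

```cpp
#include <cmath>

// The game sets a time value on the tile material each frame; the shader
// turns it into a brightness factor that oscillates smoothly.
float TileBrightness(float timeSeconds, float minBrightness, float maxBrightness)
{
    // Map sin(t) from [-1, 1] into [0, 1], then into [min, max].
    float t = 0.5f * (std::sin(timeSeconds) + 1.0f);
    return minBrightness + t * (maxBrightness - minBrightness);
}
```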

I ended up spending a lot of time on these shaders, as the results were immediately visible, and rewarding.

The next phase of my game involved a line renderer, to show the user where their projectile would land. I was initially going to use DirectX's line primitive to render my trajectory, but once I realized that those lines couldn't be made thicker, I decided to make my own line renderer, with each individual segment made from a quad. I already had the points which would form my line. I took the coordinates of two consecutive points, offset each point in the positive and negative x direction to get four vertices, and used them to form a quad. These quads gave me a line! I experimented a little further and decided to render only alternate quads, so that the output would be a dotted line.

I developed a new shader for my line, which made it more transparent towards the edges and opaque at the center. It also created a small strip which traveled along the line renderer, looping back when it reached the end (you can see it in the video below; notice the strip on the line renderer).
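The looping motion of that strip can be sketched with an fmod-based wrap; this is an illustrative C++ rendering of the idea, not the shader's actual code, and all names are assumptions:

```cpp
#include <cmath>

// 'u' is a pixel's parameter in [0, 1] along the line. The strip's start
// advances with time and wraps back to the beginning via fmod.
bool InsideTravelingStrip(float u, float timeSeconds, float speed, float stripLength)
{
    float stripStart = std::fmod(timeSeconds * speed, 1.0f);
    float d = u - stripStart;
    if (d < 0.0f) d += 1.0f; // wrap around the end of the line
    return d < stripLength;  // pixel lies inside the moving strip
}
```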

My engine already had a collision system, but I had to tweak it to detect collisions in three dimensions (it uses swept collision detection 🙂). I created separate callback functions for my collision system, which would destroy my projectiles on impact and change the tile colors based on the team shooting them.
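A sketch of what such a callback might look like; `Actor`, the fields, and the callback signature are all illustrative stand-ins for the engine's actual types:

```cpp
#include <functional>

struct Actor
{
    bool destroyed = false;
    int teamColor = 0; // 0 = unclaimed, 1 = red, 2 = blue
};

using CollisionCallback = std::function<void(Actor&, Actor&)>;

// Invoked by the collision system when a projectile hits a tile:
// the tile takes the shooter's team color and the projectile is
// flagged for destruction.
void OnProjectileHitsTile(Actor& projectile, Actor& tile)
{
    tile.teamColor = projectile.teamColor;
    projectile.destroyed = true;
}
```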

I used a simple UI layout. The first part consisted of a bar which moved to the left if player 1 was winning, or to the right if player 2 was winning (its color was determined by the winning team, i.e., it would be red if the red team was winning and vice versa). The second UI element was a text box which told the user which player was to play next.
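The bar's behavior can be sketched as a small pure function; the names, the scale factor, and the color encoding here are illustrative assumptions:

```cpp
struct BarState
{
    float offset; // negative = shifted left (player 1 ahead), positive = right
    int color;    // 1 = red (player 1 leads), 2 = blue (player 2 leads), 0 = tied
};

// Slide the bar toward the leading player and color it for that team.
BarState UpdateWinBar(int player1Tiles, int player2Tiles, float pixelsPerTile)
{
    int lead = player2Tiles - player1Tiles;
    BarState s;
    s.offset = lead * pixelsPerTile;
    s.color = (lead < 0) ? 1 : (lead > 0) ? 2 : 0;
    return s;
}
```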


And that’s my game!

EDIT: Added a new shader! I created a fragment shader for when you convert a tile. Here’s what it looks like

The idea had been growing in my head for quite a few days, and I decided to implement it when I had some free time on a train journey home. The strips now move inwards, eventually resulting in a blank tile. Then, new strips ( of the color to which the tile has been converted) emanate from the center of the tile and move towards the edges.

You can download it here: Click here

(Unfortunately, it doesn’t work if you’re working with multiple monitors 🙁 )

Controls: Use w, s, a, d to aim your projectile. Use i, j, k, l to move the camera around. Use numpad 2, 4, 6, 8 to rotate the camera around (don't forget to turn Num Lock on!). Use m to shoot the projectiles. Good luck, have fun 😀

EAE day!

With EAE day approaching, we needed to get a decent build ready so that we could have people playtest it. Ron had rearchitected some of our code, and our Lights Out build was ready to go. But we wanted to have another idea playtested at EAE day: our Detonate mode. It involved firing a projectile which would travel along the tube; you were in charge of when to trigger it, causing it to explode and permanently destroy a small portion of the tube. You had to make your opponent fall out of the tube a set number of times to win.

We had to recreate this mode in our rearchitected code, which proved harder than expected. We were in the lab until 1:30 a.m. trying to get it to run. We finally managed to get it ready, and went home for a temporary respite.

The next day, when I arrived at the venue for EAE day, we realized the controls for our builds were really faulty, and I immediately got to work on them along with Skip. We managed to get a build with acceptable controls ready and hand it to our producers, so that they could load it onto the computers which would be used for demonstrating our game.

EAE day was amazing. A host of people had come to see our games, and I even met an industry pro who had worked on sound for games such as Dota 2 and successful movies like Avatar. Our game was well received by the audience, but it had noticeable camera issues, which I plan to fix ASAP.

That’s it for this semester. I foresee a super busy final semester coming up 🙂

Rendering Sprites!

For this assignment, we had to render sprites (the stuff HUDs and GUIs are made of).

A sprite is an image which is rendered on the screen. It consists of the most basic structure which enables us to render an image on screen: a rectangle. However, unlike other meshes in our game, the sprite is not affected by any changes in the position or direction of the camera. It is always the same size, and renders at the same position on the screen. The reason? We specify screen coordinates in order to render sprites. The vertex shader does not perform transformations to bring the vertex positions from model space to world space and so on; they are already in screen space to begin with.

I created a separate vertex shader for my sprites, which simply output the positions without transforming them in any way. The new fragment shader I created for rendering sprites simply sampled the necessary texture, and output the appropriate color.
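The contrast with a regular mesh vertex can be sketched like this; `Vec4` and `SpriteVertexShader` are illustrative C++ stand-ins for what is actually shader code:

```cpp
struct Vec4 { float x, y, z, w; };

// A regular mesh vertex would be multiplied through the
// model -> world -> view -> projection matrices here. A sprite vertex is
// authored directly in screen space, so the sprite's vertex shader simply
// passes the position through untouched.
Vec4 SpriteVertexShader(Vec4 position)
{
    return position;
}
```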

I created a Sprite class for my engine; here's how it looks:



It basically had a Vertex Buffer pointer, which held its mesh data; a Quad, which specified the dimensions of its rectangle (in my case, the top-left corner of the rectangle, its width, and its height); a Material pointer, which contained its shader information; and additional information in case it acted as a sprite sheet. I used the following sprite sheet in my game:
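The actual class was shown as a screenshot; based on the description above, a sketch of it might look like the following, with `VertexBuffer` and `Material` standing in for the engine's own types and all member names being guesses:

```cpp
#include <cstdint>

struct Quad
{
    float left, top;     // top-left corner, normalized screen coordinates
    float width, height;
};

class VertexBuffer; // engine type (placeholder)
class Material;     // engine type (placeholder)

class Sprite
{
public:
    VertexBuffer* m_vertexBuffer = nullptr; // the quad's mesh data
    Quad m_quad{};                          // on-screen rectangle
    Material* m_material = nullptr;         // shader + texture info

    // Sprite-sheet support: the texture is divided into m_elementCount
    // horizontal cells, and m_currentElement selects which one to draw.
    uint32_t m_elementCount = 1;
    uint32_t m_currentElement = 0;

    // Texture-space width of one sheet cell.
    float ElementWidth() const { return 1.0f / m_elementCount; }
    // Left texture coordinate of the current cell.
    float ElementU() const { return m_currentElement * ElementWidth(); }
};
```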


This is what my game looks like when the current element pointed to by the sprite sheet is 1.


And here's what it looks like when the current element pointed to by the sprite sheet is 4.


The sprite class also shared a NativeResolution object, which told it what resolution its dimensions were specified for. If the resolution changed, I would scale the Quad's width and height accordingly, so that the texture size was unaltered. The top-left corner of the Quad was specified in normalized screen coordinates, and would remain the same across all resolutions.
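The rescaling step can be sketched as follows; `Resolution` and `RescaleQuad` are illustrative names. Since the quad's width and height are normalized, multiplying by the ratio of native to current resolution keeps the on-screen pixel size constant:

```cpp
struct Resolution { float width, height; };

// Scale the quad's normalized width/height so the sprite occupies the same
// number of pixels at the current resolution as it did at the native one.
// The normalized top-left corner is deliberately left untouched.
void RescaleQuad(float& quadWidth, float& quadHeight,
                 Resolution native, Resolution current)
{
    quadWidth  *= native.width  / current.width;
    quadHeight *= native.height / current.height;
}
```

For example, a quad 0.1 wide at a native 1000-pixel width (100 pixels) becomes 0.0625 wide at 1600 pixels, which is still 100 pixels.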

Here’s what my sprites look like at resolution 1000 X 1000.


and here’s what they look like at 1600 X 800


(It may not be evident from these pictures, but their size has remained the same!)

I also changed some of my debug events for PIX, so that they looked more organized.

Here's how the events look now:


In order to render sprites, the depth test needs to be turned off. Here’s PIX showing the disabled depth test during the draw call

Depth test

We did not use alpha values for our meshes. However, the same is not true for our sprites, and we enable alpha blending before we draw them. Here’s PIX confirming that our blend state is enabled
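With that blend state enabled, each sprite pixel is combined with what's already in the framebuffer using the standard alpha-blend equation, sketched here per color channel:

```cpp
// Standard alpha blending: the source (sprite) color is weighted by its
// alpha, the destination (framebuffer) color by one minus that alpha.
float AlphaBlend(float srcColor, float dstColor, float srcAlpha)
{
    return srcColor * srcAlpha + dstColor * (1.0f - srcAlpha);
}
```

So a fully opaque pixel (alpha = 1) replaces the framebuffer color, and a fully transparent one (alpha = 0) leaves it untouched.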



I also added a simple way to add sprites into the game. I call this the "spriteInformation" file. It contains information about where the sprite appears on the screen, what its dimensions are, and what material it uses. It also has the native resolution for which the sizes were intended. Here's what it looks like:




And that’s it for this assignment. Here’s the link to download the executable: Click Here

You can press the plus key on the keypad to move through the sprite sheet!