
Camera concerns

With our new direction in mind, we decided to keep the environment simple and geometric. We had to concentrate on the more important issues, like camera and character controls.

I was tasked with improving the camera. Skip had done a pretty good job of having the camera adjust its distance from the player based on whether it was obstructed by a tile. However, the adjustment was a little abrupt. I changed the code so that the camera zooms in quickly but zooms out at a slower rate, which keeps it from looking too jittery.
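The asymmetry is easy to express in code. Here’s a minimal sketch of the idea (the speeds and names are illustrative, not the actual values from our game):

    #include <algorithm>

    // Move the camera's current distance toward the unobstructed target
    // distance: snap in quickly, ease back out slowly.
    float UpdateCameraDistance(float current, float target, float deltaTime)
    {
        const float zoomInSpeed = 20.0f;  // units per second when obstructed
        const float zoomOutSpeed = 4.0f;  // units per second when clear again

        if (target < current) // something is in the way: zoom in fast
            return std::max(target, current - zoomInSpeed * deltaTime);
        else                  // obstruction gone: zoom back out gently
            return std::min(target, current + zoomOutSpeed * deltaTime);
    }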

Assignment 10 – Adding Diffuse Lighting!

Our goal for this assignment was to add diffuse lighting to our game.

Diffuse lighting is a model of lighting in which we assume that all light incident on an object is reflected uniformly in all directions. In other words, an object’s face will not be brighter (or darker) when viewed from certain directions. As long as we can view it (i.e., we are not standing behind it), it will appear the same. This model of lighting is much easier to simulate than specular lighting, where the angle at which the light is incident on the object determines where it is reflected.

How do we determine how well lit an object is? Surface normals! Each vertex has a normal associated with it, which tells us which direction is “outward” for that vertex.

[Image: surface normals]

 

If we also have knowledge about the direction of our light, it is fairly easy to calculate the component of that light which will interact with our object, using the dot product (the dot product of two vectors A and B is given by |A||B|cos(theta), where theta is the angle between them).

Hence, if our light source is anti-parallel to our normal, the dot product value (dot(normal, -light_direction)) will be 1, indicating that all of the light will interact with our object.

If they were perpendicular, the dot product would give 0, indicating that the light would not interact with our object. We clamp negative values to 0 (since negative lighting wouldn’t make sense).

How do we translate these values into the color of our object’s fragments? Up until now, the color of each fragment was determined by the sampled texture color, the color from the vertices, and the color modifier constant. Now we add another factor into this calculation. We calculate the dot product of the negative of the light’s direction with the fragment’s normal, multiply this by the light’s intensity, and factor it into the fragment’s color. This means that the fragment will be brightest when it is facing the light, and will gradually grow duller as it turns away from it. The intensity of the light also determines the maximum brightness the color can attain. We also take into account some ambient light (stray light arriving at the object after multiple reflections off surrounding objects; these calculations would be too intensive to do accurately, so we just approximate ambient light with a constant term). Even if an object is not receiving any direct light, the ambient light ensures that it is faintly visible.
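Putting the math above into code, here’s a minimal C++-style sketch of the per-fragment calculation (the real work happens in the fragment shader; all of these names are illustrative):

    #include <algorithm>

    struct Vec3 { float x, y, z; };

    static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3 Scale(const Vec3& v, float s) { return { v.x * s, v.y * s, v.z * s }; }
    static Vec3 Add(const Vec3& a, const Vec3& b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
    static Vec3 Modulate(const Vec3& a, const Vec3& b) { return { a.x * b.x, a.y * b.y, a.z * b.z }; }

    // baseColor = sampled texture color * vertex color * color modifier constant.
    Vec3 LitFragmentColor(const Vec3& baseColor, const Vec3& normal,
                          const Vec3& lightDirection, const Vec3& lightColor,
                          const Vec3& ambientColor)
    {
        // Component of the light hitting the surface; clamped to 0 so a face
        // pointing away from the light contributes nothing (no negative light).
        const float diffuseAmount = std::max(0.0f, Dot(normal, Scale(lightDirection, -1.0f)));

        // Diffuse contribution plus the constant ambient approximation, which
        // keeps unlit faces faintly visible.
        return Modulate(baseColor, Add(Scale(lightColor, diffuseAmount), ambientColor));
    }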

Onto the actual implementation. We were already importing our meshes from Maya, and had the normal data available to use. I changed my mesh format to include the normal data, and had to change my mesh builder to read the normals in. I also had to change my vertex data to accommodate the additional three floats for the normal. The beauty of loading the mesh data as a binary file during run time is that we don’t have to change the way we load it, as it is already in the form we desire.
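For reference, here’s roughly what the per-vertex data looks like after the change (the field names are illustrative, but the three new floats are the normal):

    #include <cstdint>

    struct sVertex
    {
        float x, y, z;       // position
        float nx, ny, nz;    // normal (the three new floats)
        float u, v;          // texture coordinates
        std::uint32_t color; // vertex color
    };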

I added a light source class to my game, which has an intensity and a direction; it stores the data for the lights I use in my game. I needed to pass normal data to the fragment shader, via the vertex shader, so that it could compute the lighting. I changed the vertex shader parameters so that they now accept normals as well.

[Image: vertex shader parameters]

I transformed the normals from model space to world space in the vertex shader, and passed them on to the fragment shader. I also had to add constants to the fragment shader to account for the light’s color and direction.

[Image: fragment shader constants]
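One detail worth noting: normals are directions, so only the rotation part of the world matrix applies to them, and the translation row must be ignored. A sketch of the transform, assuming uniform scaling (with non-uniform scaling you would use the inverse transpose instead):

    struct Vec3 { float x, y, z; };

    // Transform a direction by the upper-left 3x3 of a row-major 4x4 world
    // matrix; the translation row is deliberately left out.
    Vec3 TransformNormal(const Vec3& n, const float m[4][4])
    {
        return {
            n.x * m[0][0] + n.y * m[1][0] + n.z * m[2][0],
            n.x * m[0][1] + n.y * m[1][1] + n.z * m[2][1],
            n.x * m[0][2] + n.y * m[1][2] + n.z * m[2][2],
        };
    }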

The fragment shader gets the interpolated normals from the vertex shader; these need to be re-normalized, since after interpolation they are no longer unit vectors. It then calculates the dot product to find the component of the light that interacts with the object.

[Image: fragment shader lighting code]

 

This would give us the right color for the fragment, taking into account lighting!

Here’s what my fragment shader debug code looks like in PIX. It tells you how a particular fragment on my red cube got the color it is rendering.

[Image: PIX debug capture]

[Image: cube fragment debug]

 

As you can see, the diffuse amount (the output of the dot product) is quite low (0.282), indicating that the light direction is closer to perpendicular to the face’s normal than anti-parallel to it.

I also added a scene file for this assignment, which contains information about all the actors in my game. Here’s what it looks like:

[Image: scene file]

I read this file in my game code to populate the world with actors.

This lets you create actors in the game by simply changing a human-readable file. You can decide an actor’s initial position, mesh and material. (I will be adding more functionality as and when I need it, like velocity and collision system status.) Adding meshes, materials and actors to my game is now easier than ever! Want a new mesh to be added to an actor? Simply add its name to the meshList.txt, change the scene file to say that the actor uses a mesh with that name, and you’re all set! (You’ll have to add the mesh to the assets folder, and add its name to the AssetsToBuild lua file. Maybe I should have my AssetsToBuild file generate the mesh names from my meshList.lua file?)
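Since the scene file is a Lua table, populating the world boils down to walking that table with the Lua C API. Here’s a minimal sketch of the idea, assuming the file defines a global array of actor tables (the global and field names here are illustrative, not my actual scene format):

    #include <lua.hpp>

    // Walk a global "actors" array and create an actor for each entry.
    void LoadScene(lua_State* L, const char* path)
    {
        if (luaL_dofile(L, path) != 0)
        {
            lua_pop(L, 1); // pop the error message
            return;
        }

        lua_getglobal(L, "actors");               // assumed global table
        const int count = (int)lua_rawlen(L, -1); // lua_objlen() in Lua 5.1
        for (int i = 1; i <= count; ++i)
        {
            lua_rawgeti(L, -1, i);                // push actors[i]

            lua_getfield(L, -1, "mesh");
            const char* mesh = lua_tostring(L, -1);
            lua_pop(L, 1);

            lua_getfield(L, -1, "material");
            const char* material = lua_tostring(L, -1);
            lua_pop(L, 1);

            // CreateActor(mesh, material); // position etc. read the same way
            lua_pop(L, 1);                        // pop actors[i]
        }
        lua_pop(L, 1);                            // pop the actors table
    }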

I also added a Lua file to hold lighting information. It is very basic right now, and contains the ambient light color, as well as the color of the single diffuse light in our game.

[Image: lighting information file]

 

 

Another way to make life easier for people who want to tweak values inside my game, but don’t really want to look at my code 🙂

Right now, my diffuse light is located above my objects, pointing straight down. You can use the left Ctrl and left Shift keys to move it along the x axis (left Ctrl = -ve x axis, left Shift = +ve x axis), and the right Ctrl and right Shift keys to move it along the z axis (right Ctrl = -ve z axis, right Shift = +ve z axis). The light will adjust itself in such a way that it always points towards the origin.
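Keeping the light aimed at the origin just means recomputing its direction from its position after every move; a tiny sketch (the names are illustrative):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    Vec3 Normalize(const Vec3& v)
    {
        const float length = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return { v.x / length, v.y / length, v.z / length };
    }

    // direction = normalize(origin - position) = normalize(-position)
    void AimLightAtOrigin(const Vec3& position, Vec3& direction)
    {
        direction = Normalize({ -position.x, -position.y, -position.z });
    }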

That’s it for this assignment. Here’s a zip file containing my game: Click Here

 

Assignment 9 – Building a mesh exporter for Maya!

Up until now, we had been entering the position, color, and UV values for our meshes by hand. This approach is not feasible for more complex meshes, say a torus or a sphere, which have way more vertices. We address that issue with this assignment.

We created a plug-in for Maya (a 3D modeling software), which takes a mesh you create in it and exports it in the human-readable format we designed for our game. We can then read this format in and render the mesh in our game. Exporting from Maya also gave us access to a host of other vertex data (like normals and tangents), but I only export the data which my game uses for this assignment (namely position, color, texture coordinates, and triangle indices).

Maya uses a right-handed coordinate system, as opposed to the left-handed one used by DirectX. To take that into account, I had to invert the z position values, and also the triangle index order, so that the data conformed to what DirectX expected.
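In code, that conversion is small: negate every z, and swap two indices in each triangle so the winding order flips. A sketch, assuming simple vertex and triangle structs:

    #include <utility>
    #include <vector>

    struct Vertex { float x, y, z; };
    struct Triangle { int i0, i1, i2; };

    // Convert mesh data exported from right-handed Maya for left-handed DirectX.
    void ConvertToLeftHanded(std::vector<Vertex>& vertices,
                             std::vector<Triangle>& triangles)
    {
        for (Vertex& v : vertices)
            v.z = -v.z; // mirror across the XY plane

        for (Triangle& t : triangles)
            std::swap(t.i1, t.i2); // flip winding so faces still point outward
    }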

Here’s what my pyramid mesh looks like (it’s a lot less complex/compact than my cylinder mesh, which has a ton more vertices):

 

[Image: pyramid mesh file]


 

I’m currently rendering 6 meshes in my game.

-A floor mesh (floorMesh.txt)

-A pyramid mesh (pyramidMesh.txt)

-2 cylinder meshes (cylinderMesh.txt)

-2 cube meshes (cubeMesh.txt)

I added controllers to the game, which determine how my actors behave.

I also added a world system from my previous assignments, which keeps track of all the actors in the game (so that I can just loop through them in the render function and draw them!). Here’s how I assign meshes and materials to my actors now, and add them to the world:

[Image: Game.cpp actor setup]

 

Another addition for this assignment was player settings. We now have a settings.ini file in the game directory, which ships with the game, and enables the user to change the width and height of the window, and also to toggle fullscreen mode on or off.

[Image: settings.ini file]

 

As long as the user enters a valid resolution, the game window will scale accordingly. However, if the dimensions are invalid, the game will default to a width of 800 and a height of 600.

I load the settings file in as a Lua table, and go through the values independently to use them in the game.

Here’s how I read the values:

[Image: settings loading code]
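In outline, each value is fetched from the table with a fallback to the defaults; a sketch with the Lua C API, run after the settings file has been executed (the global names are assumptions):

    #include <lua.hpp>

    // Read width/height, falling back to the 800x600 defaults whenever a
    // value is missing or not a number.
    void ReadResolution(lua_State* L, int& width, int& height)
    {
        width = 800;
        height = 600;

        lua_getglobal(L, "width");
        if (lua_isnumber(L, -1))
            width = (int)lua_tointeger(L, -1);
        lua_pop(L, 1);

        lua_getglobal(L, "height");
        if (lua_isnumber(L, -1))
            height = (int)lua_tointeger(L, -1);
        lua_pop(L, 1);
    }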

(Thanks to Vinod Madigeri for helping me with the IsThisWidthAndHeightSupported function, which validates the user’s width and height combination.)

I also added custom events for PIX (so that I can group events, like setting materials, etc.).
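For reference, the usual way to group work under a named event in PIX with D3D9 is the D3DPERF marker pair; a sketch (the event name is just an example, and my actual code may differ):

    #include <d3d9.h>

    // Group the material-setting calls under one named event in a PIX capture.
    void SetMaterialWithPixEvent()
    {
        D3DPERF_BeginEvent(D3DCOLOR_XRGB(255, 255, 255), L"Set Material");
        // ... SetVertexShader / SetPixelShader / SetTexture calls ...
        D3DPERF_EndEvent();
    }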

 

[Image: PIX custom events]

Here’s what my game looks like now!

[Image: game screenshot]

Here’s a link to download the zip file for my executable: Click Here

 

Lights Out!

I prototyped my idea with the help of a hex grid last weekend. It was really interesting to study the coordinate systems you can use with a hex grid, as I had to figure out the logic for determining whether two tiles on the hex grid were connected.

I ended up storing two lists, one per player, with information about all the tiles each player currently controlled. Whenever a player captured a new tile, I would check for connections with existing tiles, and if I found one, I would grant all the connecting tiles to that player and remove them from the other player’s list. The hex grid system looked neat, and the mechanic seemed fun to me 🙂
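The connectivity test was the interesting part. With axial coordinates, for example, adjacency reduces to checking six fixed offsets; a sketch of the idea (my prototype’s actual representation may have differed):

    struct Hex { int q, r; }; // axial hex coordinates

    // Two tiles are connected if their coordinate difference is one of the
    // six axial neighbor directions.
    bool AreAdjacent(const Hex& a, const Hex& b)
    {
        static const int directions[6][2] = {
            {  1, 0 }, {  1, -1 }, { 0, -1 },
            { -1, 0 }, { -1,  1 }, { 0,  1 },
        };
        const int dq = b.q - a.q;
        const int dr = b.r - a.r;
        for (const auto& d : directions)
            if (dq == d[0] && dr == d[1])
                return true;
        return false;
    }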

We met in class on Tuesday and showed off our prototypes. Everyone had done a terrific job, and though people liked my idea, we all came to the conclusion that Skip’s “Lights Out” idea was simple and effective, and should be the way we move forward.

The idea was to give the users the ability to turn their tiles on and off. This gave the painting a lot more meaning.

This meant that we would be scrapping a LOT of our current game. Most of the environment art, the weapons system and the UI would have to be thrown away. To say that I was scared was an understatement. We were almost starting afresh, but I believe it was a necessary step in order to make our game fun.

 

Assignment 8 – Adding Textures and UVs!

Till now, we had been assigning colors to our vertices individually, and letting the interpolator handle the calculation that determines the color of each fragment in between. If we wanted two particular objects which used the same Material (i.e., the same vertex and fragment shaders) to look different, we could only do so by changing the color modifier constant in the fragment shader for that particular object. But this only allowed us to change the overall color of the object by multiplying the color of each fragment by a constant. We could not, for example, draw a smiley face on the face of the cube we’d been rendering.

That’s where textures come into play! A texture is basically a picture (a 2-dimensional array of colors). You apply this picture to the mesh you’ve rendered. But the mesh needs to know which vertex on it corresponds to which part of the image. This is done with the help of UV maps!

UV maps help to map a 2D texture onto a 3D object. If you imagine one corner of the texture placed on the origin of the U-V (or X-Y) plane, with the other corners of the texture at (1,0), (0,1) and (1,1), each point on the texture now has a U and a V coordinate between 0 and 1. Now, if we assign a U-V value to each of our mesh vertices, we can map these vertices to points on the texture, and that’s how we figure out the color a particular mesh vertex needs to have.
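For example, to stretch an entire texture across a square face, you would give the four vertices the four corner UVs (a sketch; the struct and positions are illustrative):

    struct TexturedVertex
    {
        float x, y, z; // position
        float u, v;    // texture coordinates
    };

    // A square face whose corners map to the corners of the texture, so the
    // whole image is stretched across it (v runs top-to-bottom in DirectX).
    const TexturedVertex quad[4] = {
        { -1.0f,  1.0f, 0.0f,   0.0f, 0.0f }, // top-left     -> (0,0)
        {  1.0f,  1.0f, 0.0f,   1.0f, 0.0f }, // top-right    -> (1,0)
        { -1.0f, -1.0f, 0.0f,   0.0f, 1.0f }, // bottom-left  -> (0,1)
        {  1.0f, -1.0f, 0.0f,   1.0f, 1.0f }, // bottom-right -> (1,1)
    };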

[Image: triangle with UVs]

[Image: UV mapping diagram]

 

 

Our goal for this assignment was to add textures to our meshes. The first step was to create a new builder tool, the “TextureBuilder”, which in my case takes .png files and outputs .dds files (DirectDraw Surface, a file format used by DirectX) into the data folder for the game.
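I won’t reproduce the builder here, but with the D3DX utility library the conversion can be as small as loading the .png into a texture and saving it back out as a .dds; a sketch under that assumption (error handling trimmed):

    #include <d3dx9.h>

    // Convert a source image to DDS. Assumes an initialized D3D9 device.
    bool BuildTexture(IDirect3DDevice9* device,
                      const char* sourcePath, const char* targetPath)
    {
        IDirect3DTexture9* texture = NULL;
        if (FAILED(D3DXCreateTextureFromFileA(device, sourcePath, &texture)))
            return false;

        const HRESULT result =
            D3DXSaveTextureToFileA(targetPath, D3DXIFF_DDS, texture, NULL);
        texture->Release();
        return SUCCEEDED(result);
    }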

The fragment shader is responsible for “sampling” a texture at a given U-V coordinate and returning the color at that location. The U-V values themselves are interpolated between the vertices, just like the vertex colors. This U-V data needs to be passed to the fragment shader by the vertex shader, and to the vertex shader by us. Hence, I had to change my vertex format, and also my vertex and fragment shaders.

Here’s my vertex shader parameters list right now:

[Image: vertex shader parameters]

Here’s my fragment shader parameters list right now:
[Image: fragment shader parameters]

 

And here’s what my vertex declaration has changed to:

[Image: vertex declaration]

Our meshes now needed to store UV data as well. I altered my mesh file format to look as follows (this is my new floorMesh.txt):

[Image: floorMesh.txt with UVs]

 

I also had to change the way I parsed my human-readable mesh file and wrote it in binary to my data folder, and make small changes to my read code, so that I could offset the position from which I read my indices after reading the mesh file during run-time.

I now have a texture associated with each Material, and I’ve incorporated a path to the texture in the Material files. Here’s an example:

[Image: material file with texture path]

My Material class now holds a pointer to the texture it uses, and also the index of the sampler used by the fragment shader (basically, the data I need to set the texture before a draw call). When you set a material now, it loads the texture in addition to the shaders.
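The bind itself is a single call on the device; a sketch of how that cached data gets used (the member names are illustrative):

    #include <d3d9.h>

    struct Material
    {
        IDirect3DTexture9* texture; // loaded when the material is set
        DWORD samplerIndex;         // sampler register used by the fragment shader
    };

    // Bind the material's texture to its sampler before the draw call.
    void ApplyMaterialTexture(IDirect3DDevice9* device, const Material& material)
    {
        device->SetTexture(material.samplerIndex, material.texture);
    }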

Here’s part of my set material function, called before I render an object.

[Image: SetMaterial code]

 

 

And that’s it for this assignment. Doesn’t it look pretty!

 

[Image: game screenshot]

 

Here’s what my SetTexture() call looks like in PIX:

 

 

 

[Image: SetTexture call in PIX]

 

Link to download my current Assignment: Click Here

Controls are the same: arrow keys for moving the cube, numpad 2, 4, 6, 8 to move the camera. Make sure Num Lock is on!

Assignment 7 – Adding a Mesh file format and a Mesh Builder

We had been using hard-coded vertex and index values for our meshes, but that is hardly feasible in a real game. Ideally, we would want our vertices and indices to be generated by a modeling software such as Maya, and to then load the exported mesh values into our game.

For this assignment, we had to create a human-readable mesh file format, which we would parse using a new Mesh builder, convert into a binary file, and finally load at runtime to use in our game. Human readability was of utmost priority, and with that in mind, I decided to use the following file format. Here’s what my Cube and Floor Mesh files look like:

[Image: cube mesh file]

 

[Image: floor mesh file]

 

I did not wish to clutter my data with letters, such as

x = -5.0, y = -1.0, z = -1.0,

as I felt that they hindered readability. The data seems self-explanatory without the letter aids. Each mesh file includes the number of vertices in the mesh, the number of indices, the positions of the vertices, and any other data, which in my case happens to be the vertex colors. Each table in the indices section corresponds to a triangle.

The use of arrays for the Lua data also proved easier to parse. Here’s what the core of my MeshBuilder code looks like:

[Image: MeshBuilder code]

This would convert my human-readable mesh data into a compact binary file which I could use in my game.

[Image: binary mesh file]

 

I keep track of the number of values on the stack, so that if anything fails, I can simply pop that many values and jump to the OnExit label.
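A sketch of that cleanup pattern (the label and names are illustrative):

    #include <lua.hpp>

    bool LoadMeshTable(lua_State* L)
    {
        bool ok = true;
        int stackItems = 0; // values we have pushed and still own

        lua_getglobal(L, "vertices");
        ++stackItems;
        if (!lua_istable(L, -1))
        {
            ok = false;
            goto OnExit;
        }

        // ... push and read more values, bumping stackItems each time ...

    OnExit:
        lua_pop(L, stackItems); // pop everything we pushed, success or failure
        return ok;
    }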

I used a single read for the file (which reads the entire file into an appropriately sized buffer), and a single write as well, which copies the binary data into the data folder of my game. I purposely kept these reads/writes to a minimum, as they can bottleneck performance.
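That single-read pattern looks roughly like this (a sketch; error handling trimmed):

    #include <cstdio>
    #include <vector>

    // Read an entire file into memory with one fread call.
    std::vector<char> ReadWholeFile(const char* path)
    {
        std::vector<char> buffer;
        FILE* file = std::fopen(path, "rb");
        if (!file)
            return buffer;

        std::fseek(file, 0, SEEK_END);
        buffer.resize(std::ftell(file));
        std::fseek(file, 0, SEEK_SET);

        std::fread(buffer.data(), 1, buffer.size(), file); // the single read
        std::fclose(file);
        return buffer;
    }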

Here’s what the memory I copy the mesh data into looks like:

[Image: mesh data in memory]

Here’s how I read the data in my game (my Mesh class does it, and creates the vertex and index buffers in the process).

Reading the data:

[Image: mesh reading code]

Once I have the mesh data read, I can simply use memcpy (which copies a chunk of bytes) to copy the vertex info into the vertex buffer! I can follow a similar procedure for the index buffer (thus saving me the trouble of iterating through the data and filling the buffers entry by entry).

Filling the vertex buffer

[Image: vertex buffer filling code]

Filling the index buffer

[Image: index buffer filling code]
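The D3D9 pattern behind both of those captures is lock, memcpy, unlock; a sketch (sizes and names are illustrative):

    #include <cstring>
    #include <d3d9.h>

    // Copy the vertex data into a D3D9 vertex buffer with a single memcpy.
    bool FillVertexBuffer(IDirect3DVertexBuffer9* buffer,
                          const void* vertexData, unsigned int sizeInBytes)
    {
        void* destination = NULL;
        if (FAILED(buffer->Lock(0, sizeInBytes, &destination, 0)))
            return false;

        std::memcpy(destination, vertexData, sizeInBytes); // whole chunk at once
        return SUCCEEDED(buffer->Unlock());
    }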

I also altered my Actor class to now hold a Mesh* and a Material* (the structures finally falling into place!). The Mesh class holds information regarding the vertex and index buffers, while the Material class holds information regarding the vertex and pixel shaders.

So now, when I initialize my game, it first goes through a list of materials and a list of meshes (both Lua files), and loads them into maps. Whenever I create a new actor, I just have to assign it a material and a mesh from these maps, and I should be good to go! This also ensures that meshes aren’t duplicated between objects using the same ones, and that changes to one mesh are reflected on all objects using it.
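In outline, the shared maps look something like this (a sketch; my actual classes hold more data):

    #include <map>
    #include <string>

    struct Mesh;     // vertex and index buffers
    struct Material; // vertex and pixel shaders

    // One shared instance per name; actors point into these maps, so nothing
    // is duplicated and edits to a mesh show up on every actor using it.
    std::map<std::string, Mesh*> meshMap;
    std::map<std::string, Material*> materialMap;

    struct Actor
    {
        Mesh* mesh;
        Material* material;
    };

    Actor CreateActor(const std::string& meshName, const std::string& materialName)
    {
        return { meshMap[meshName], materialMap[materialName] };
    }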

Here’s my initialize function:

[Image: Initialize function]

 

How I initialize my MeshMap (similar functionality for the MaterialMap):

 

[Image: MeshMap initialization]

 

 

I had to change my render calls so that they use the vertex and index buffers from the mesh objects in my actors (not too difficult).

That’s it! Here’s the zip file for this assignment: Click Here

The controls to move the cube are the same: arrow keys for the cube, and numpad 2, 4, 6, 8 to move the camera. Make sure Num Lock is on!

New Directions?

During our stand-up meeting in class post-IGF, we started talking about how our game seemed really busy, with a lot of things happening on screen. Also, our painting mechanic did not have a strategy associated with it, and that was making our game dull.

We discussed for a long while how we could incorporate mechanics which could make the game better. It took us the best part of two classes to shortlist a few ideas which we would like to prototype.

I had proposed a mechanic in which, if the user paints two tiles which are connected horizontally, vertically, or diagonally, he also gets all the tiles in between. Around four other ideas were pitched as well. We decided to let the people who had pitched the ideas prototype them, and have them ready by the following week.

I really love prototyping new ideas, and I’m looking forward to prototyping mine this weekend.

Hostile Territory has been submitted to IGF!

We got our game submitted to IGF this week. It was a great achievement for our team. I feel proud to be part of it 🙂

I worked on the UI elements, and had to update the MenuManager, since some of the UI elements could now be scrolled to horizontally as well as vertically (I had created a single-dimensional array of Actions for this, which had to be changed to a 2-D array to enable motion in both directions).

I also had to make sure the textures would scale properly for different resolutions. I achieved this by specifying the screen locations and dimensions in terms of 1920x1080 (the resolution I was working on) and scaling them for all other resolutions during load time.
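The scaling itself is just proportion math; here’s the idea as a C-style sketch (the actual UI code was C# in Unity, and the names here are illustrative):

    // Scale a UI rectangle authored against a 1920x1080 reference resolution
    // to the actual screen size.
    struct UiRect { float x, y, width, height; };

    UiRect ScaleToResolution(const UiRect& authored,
                             float screenWidth, float screenHeight)
    {
        const float scaleX = screenWidth / 1920.0f;
        const float scaleY = screenHeight / 1080.0f;
        return { authored.x * scaleX, authored.y * scaleY,
                 authored.width * scaleX, authored.height * scaleY };
    }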

I also had some trouble figuring out how to mask part of a sprite in Unity, but found a way to do it with the BeginArea() and EndArea() functions.

It was a really tiring week, and I’m looking forward to a good night’s sleep this weekend.