Engineering Two Final Assignment

Final Download!

Controls:
Up and Down arrow control the orange truck
8 and 2 on the numpad control the green truck
Keep the ball bouncing back and forth
F1 turns off debug lines

It’s basically pong. A really really terrible version.
Final

Overview of the game:
As the ball bounces back and forth, it will change color depending on who hit it last.
The score increases when you miss the ball. The score display is done using 2D UI elements and sprites.
I chose Pong because it’s a very simple game and I could hack together the collisions and reactions without actually creating a collision system. It also doesn’t have too many rules, so it made sense to keep the game side of things simple because the important part of this class was the build process and getting art assets into the game.
I didn’t really add anything new from the class into the game; it was already there from the rest of the assignments. It did help make asset creation easier and more robust, though. I have material and mesh managers that load each mesh and material one time, and then objects can request a pointer to the mesh or material when it’s needed for drawing. I found this was a great way to remove duplicated data.
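Just to illustrate the idea, here is a simplified sketch of the manager pattern (the class and function names are approximations, not my exact code, and Mesh/LoadFromFile stand in for the real loading code):

#include <map>
#include <string>

// Sketch: load each mesh once and hand out pointers to it afterwards.
class MeshManager
{
public:
    Mesh* GetMesh(const std::string& i_path)
    {
        std::map<std::string, Mesh*>::iterator it = m_meshes.find(i_path);
        if (it != m_meshes.end())
            return it->second;                   // already loaded; no duplicate data
        Mesh* mesh = Mesh::LoadFromFile(i_path); // load it the first time it's requested
        m_meshes[i_path] = mesh;
        return mesh;
    }
private:
    std::map<std::string, Mesh*> m_meshes;
};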

I actually liked the class a lot. I really liked creating all the build tools. I think having taken a really basic graphics class in the past helped me in this class, even if it was for OpenGL. The most interesting things I learned were the build process from Maya to a built asset, messing with shaders (which were a total enigma to me before this class), and lighting. Oh, and also the proper way to read in meshes. In the past, I had done all of these things really wrong, and it was nice to see the right way to do it, or at least one of the right ways. Before, I was just finagling with things and making them work, even if they didn’t work well at all. Anyways, I learned a lot, and although I got frustrated with some of the assignments, I had an overall positive experience with the class.

Engineering Two Assignment 11

Assignment 11 Download!

Controls:
0-9 Change the atlas. (Still changes lighting, so don’t be surprised when that happens… I’m running out of buttons)

So this week we added in UI elements that sit on top of the rest of the world objects. What’s cool about this is that we don’t need to do any of the transformations from model to world to view to screen space. We just know we want to put the element in a specific spot on the screen, so we can throw it straight into screen space. However, we want to make sure the images stay the size we want even when the resolution changes. Look below!

600x600

1200x600
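The way the sprite keeps its pixel size is roughly this (a sketch; the function and variable names are made up for illustration):

// Sketch: keep a sprite the same pixel size regardless of resolution by
// converting its pixel dimensions into the -1..1 screen space each frame.
void ComputeSpriteScreenSize(float i_pixelWidth, float i_pixelHeight,
    float i_screenWidth, float i_screenHeight,
    float& o_screenSpaceWidth, float& o_screenSpaceHeight)
{
    // Screen space runs from -1 to 1, so the whole screen is 2 units across.
    o_screenSpaceWidth = 2.0f * i_pixelWidth / i_screenWidth;
    o_screenSpaceHeight = 2.0f * i_pixelHeight / i_screenHeight;
    // At 600x600 a sprite covers the same fraction of the screen in both directions;
    // at 1200x600 it covers half as much of the width, which is why the screenshots
    // above look different even though the sprite is the same number of pixels wide.
}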

Although this looks kind of wonky initially, it’s still preferable to getting a stretched image or an artist needing to worry about all the different configurations. Anywhos… The biggest accomplishment this week is getting textures working again! :D Or… rather, working in the first place. The problem was that my vertex shader was receiving the UVs but not bothering to pass them along, so the fragment shader just… never got them. So all I had to do was say the out-texture coords = the in-texture coords. Easy peasy.

Unfortunately, I feel like I duplicated a lot of code from the mesh class, the material class, and their managers. A LOT of the code was similar and only needed slight tweaks to handle a UI texture element instead, but it is what it is.

After getting textures on the screen, I had to make sure the transparency of the texture was respected, so I needed to turn on alpha blending and disable the z-buffer, because we just want to throw the UI on top of everything no matter what. Then you need to toggle it all back the next frame when you re-draw the shapes and other non-UI elements. You can see in the picture below that the alpha and depth settings are appropriately toggled for the UI textures.

PixUI
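The toggling itself boils down to a handful of render state calls, roughly like this (a sketch; the function names are mine for illustration):

#include <d3d9.h>

// Sketch: state changes around the sprite pass.
void BeginSpritePass(IDirect3DDevice9* i_device)
{
    i_device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    i_device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    i_device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
    i_device->SetRenderState(D3DRS_ZENABLE, D3DZB_FALSE);   // UI always draws on top
}

void EndSpritePass(IDirect3DDevice9* i_device)
{
    i_device->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
    i_device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);    // meshes need the depth buffer again
}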

After getting that to work, I had to work on a texture atlas. I’ve worked with a similar concept before, a sprite sheet, in a few previous games. The difference is that back then I just needed to move the source rectangle I was drawing from and didn’t have to worry about DirectX’s buffer junk. It’s really the same concept, though: you need to move the UVs that are attached to the vertices to wherever you want them to be. So you have an image that has all the image states laid out in order, like

0 1 2 3 4 5 6 7 8 9

And the UV (for my program) spans the whole image, like we’ve done with textures before. Then when you press 0-9, instead of going from 0 to 1 in the X direction, the UV covers a 0.1-wide slice in X, and Y stays the same, because we want the entire height of the image. So when you push 4, it moves the left side to 0.4 and the right side to 0.5. This makes the texture only display what is in that section. Pretty neat.
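In code form, picking the window for a digit is just this (a sketch):

// Sketch: for a 10-frame horizontal atlas, digit n uses a 0.1-wide slice of the U range.
void GetAtlasUVs(int i_digit, float& o_uLeft, float& o_uRight)
{
    const float sliceWidth = 1.0f / 10.0f;
    o_uLeft = i_digit * sliceWidth;    // e.g. digit 4 -> 0.4
    o_uRight = o_uLeft + sliceWidth;   //              -> 0.5
    // V is untouched: the full height of the image is always used.
}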

Actually pulling this off was a bit weird to me. You have to lock the DirectX vertex buffer, and after you do that it becomes available to write to, just like when you first filled it in. I was worried I would have to rebuild the whole buffer or something; really glad I didn’t need to worry about that. It still seems like it could be an inefficient way to do things if you need to do it frequently enough, but… maybe it’s not? I honestly don’t know. The other option would be to have ten different textures and just choose which one to draw each frame, but that seems like a lot of wasted space. Anyways, here are the pictures of the different atlas images.

UI-2

UI-5
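For reference, the lock-and-rewrite step from a couple of paragraphs up boils down to something like this (a sketch; the vertex layout and quad ordering are placeholders for whatever the sprite actually uses):

#include <d3d9.h>

struct sSpriteVertex { float x, y, z; float u, v; DWORD color; };   // placeholder layout

// Sketch: lock the sprite's vertex buffer and write the new U range for the chosen digit.
void SetAtlasFrame(IDirect3DVertexBuffer9* i_buffer, float i_uLeft, float i_uRight)
{
    sSpriteVertex* vertices = NULL;
    if (SUCCEEDED(i_buffer->Lock(0, 0, reinterpret_cast<void**>(&vertices), 0)))
    {
        // Assuming the quad is laid out top-left, top-right, bottom-left, bottom-right.
        vertices[0].u = i_uLeft;   vertices[1].u = i_uRight;
        vertices[2].u = i_uLeft;   vertices[3].u = i_uRight;
        i_buffer->Unlock();
    }
}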

We also had a bunch of small housekeeping things to make the game look better and be easier to debug. We turned on linear filtering, which is just a couple of settings you flick on when you initialize your DirectX device. It cleans up the lines and makes the textures look a lot better.
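The settings in question are just sampler states set once when the device is initialized, roughly (a sketch):

#include <d3d9.h>

// Sketch: turn on linear filtering for texture stage 0 at device initialization.
void EnableLinearFiltering(IDirect3DDevice9* i_device)
{
    i_device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    i_device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    i_device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);   // only matters if mipmaps exist
}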

We also improved the PIX debugging to have more detailed output. Now that we’ve added sprites, we might as well have them sectioned off separately so it’s easier to read through and find what you’re supposed to be debugging. Look:

PIXspritesep

You can see that all the mesh stuff, and the resetting of the alpha blending and Z-buffer, is done in the Mesh section, while the commands to turn off the depth buffer and re-enable alpha blending go under the sprite section. And then you can see the individual calls for each sprite, just like we did with meshes.
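The grouping comes from wrapping each section’s draw calls in named PIX events, something like this (a sketch; these only matter in debug builds):

#include <d3d9.h>   // the D3DPERF_* functions live in d3d9

// Sketch: everything between BeginEvent and EndEvent shows up under one
// collapsible "Sprites" section in PIX.
void DrawSprites()
{
    D3DPERF_BeginEvent(D3DCOLOR_XRGB(0, 255, 0), L"Sprites");
    // ...switch to the sprite render states and issue the sprite draw calls...
    D3DPERF_EndEvent();
}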

Problems I’m currently encountering:
In release mode, the UI seems to be affected by some sort of lighting effect. I’m not sure why this is happening. Check below!

ReleaseProblems

So here’s the things I’ve looked at:
In PIX, the color is recorded the way it shows up on screen, and it isn’t being overwritten afterwards.
If I replace all the release assets with the debug assets, the problem persists, so it isn’t the build process that’s messing up.
If you restart it multiple times, the shading on the atlas textures will be different. That makes me think there’s some sort of default setting or something that is randomized…
None of my shaders for sprites are using lighting. Removing lighting does nothing.
Removing either UI element does nothing.
Removing the other objects in the scene does nothing.
I made sure that I didn’t have any logic set up in any asserts, because I remembered that being a problem for some people.
So that’s my problem. If I come at it fresh, I imagine that I’ll be able to think up why it may be happening.

Time Spent:
Initial sprite class setup: 1.5 Hours
Sprite rendering and fixing textures: 2 Hours
Atlas added: 1 Hour
Attempts at fixing the problem: 1 Hour
Writeup: 30 Minutes

Total: 6 Hours

Engineering Two Assignment 10

Assignment 10 Download!

If the last assignment’s .exe didn’t work, then use this one! It’s bigger and better. And doesn’t crash ( I hope :/ )

Controls for lighting:
I: forward
J: left
K: back
L: right
It’s like WASD, but for your right hand.

Directional Light Changes:
R: increase Red
F: decrease Red
G: increase Green
T: decrease Green
B: increase Blue
V: decrease Blue

Ambient Changes:
1: increase Red
2: decrease Red
3: increase Green
4: decrease Green
5: increase Blue
6: decrease Blue

So we added lighting to our game now. The first step was to take the Maya exporter and make sure that the normals were added in to the meshes. Then change the meshes so they are read in correctly. All of this has been done before with the UV stuff, so this was pretty much repeating that. We also had to adjust the vertex declaration that we pass DirectX so it knows how big the buffer needs to be and how to read it.

So why did we need to add normals? The biggest reason is so we know how much light is hitting a surface. If the surface is facing away from the light source, it should not be lit; if the normal is directly facing the light source, it should be fully illuminated. The easiest way to get at this is the dot product, which gives you a single number. What do we do with this number? If it’s above zero, the normal is pointing the same way the light is traveling, which means the surface is facing away from the light. If the number is negative, the surface is facing the light. If the number is zero, they’re perpendicular. If both vectors are normalized, the dot product falls between -1 and 1, and clamping the negated value to the 0-to-1 range gives you a factor you can apply directly to the surface’s color. We used this a lot in Mannequin, so it wasn’t too hard a concept to understand. If you’re not looking at a mannequin, it will chase you; if you look at it, it will stop moving. It’s basically the same thing, except the opposite: the light “chases” you when you’re looking at it. Actually, this analogy is breaking down really fast… :/

So after adding the normals, I got really weird-looking objects. It was either the Maya exporter or maybe just that I didn’t have lights set up. I chose to believe it was the second. So we had to do all the fancy calculations that I mentioned in the previous paragraph about whether or not to illuminate a surface. In the vertex shader, we really only handle the normals and pass them along. Since we pass the normals per vertex, the interpolator takes care of smoothing them across the surface so it doesn’t look quite so blocky when switching faces. This does not take lights into account, however; it just interpolates the normals and moves on.

After the interpolation, the data gets passed to the fragment shader, which now calculates the lighting with all that fancy normals business. So we ask “how lit are you based on your normal?” Then we take that lit-ness and multiply it into the color of the light. This gives the amount and color of light that affects the surface. If you have an ambient color, it gets added to the light color here, and then the result is applied to the surface’s color to give the final color.
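Put together, the math is basically this. I’ve written it here as plain C++ with D3DX types for illustration, but the real thing lives in the fragment shader:

#include <d3dx9.h>

// Sketch of the diffuse + ambient calculation described above.
// i_lightDirection is the (normalized) direction the light is traveling.
D3DXVECTOR3 ComputeLitColor(const D3DXVECTOR3& i_surfaceColor, const D3DXVECTOR3& i_normal,
    const D3DXVECTOR3& i_lightDirection, const D3DXVECTOR3& i_lightColor,
    const D3DXVECTOR3& i_ambientColor)
{
    // Negative dot product means the surface faces the light; clamp to the 0..1 range.
    float litness = -D3DXVec3Dot(&i_normal, &i_lightDirection);
    if (litness < 0.0f)
        litness = 0.0f;
    // Scale the light color by how lit the surface is, add the ambient color,
    // and then tint the surface color by the result.
    D3DXVECTOR3 light = (i_lightColor * litness) + i_ambientColor;
    return D3DXVECTOR3(i_surfaceColor.x * light.x,
        i_surfaceColor.y * light.y,
        i_surfaceColor.z * light.z);
}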

So looking at my spheres, things were looking pretty good. Then I saw my cone and things went Twilight Zone on me: I could see the bottom of the cone through the side of it. I thought maybe the cone had exported wrong, so I made a torus and saw what that did. It also displayed the inside of the donut through the outside. I then assumed that my winding order was wrong in the Maya exporter, and sure enough, it was. So in the exporter, I just went through the triangle list backwards when adding to the index list, and that fixed everything. Without lighting, these were just flat shapes and I could never tell anything was going wrong. Finally, my objects did not look wonky.
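The exporter fix really was just writing the index list out in reverse, along these lines (a sketch):

#include <vector>

// Sketch: flip the winding by walking the Maya-provided index list backwards.
// Every triangle (a, b, c) comes out as (c, b, a), so the faces flip direction.
std::vector<unsigned int> ReverseWinding(const std::vector<unsigned int>& i_mayaIndices)
{
    std::vector<unsigned int> reversed;
    reversed.reserve(i_mayaIndices.size());
    for (int i = static_cast<int>(i_mayaIndices.size()) - 1; i >= 0; --i)
        reversed.push_back(i_mayaIndices[i]);
    return reversed;
}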

Using PIX, you can look at an individual pixel, debug it, and find all the draw calls that touched it. (This would have helped if the exporter hadn’t been the problem in the paragraph above.) It shows you the different times the pixel shader was called and the colors that came out of it. If your DirectX is set up in debug mode, you can step into your fragment shader, find out exactly what’s happening, and actually debug the shader function. Look:

PixShaderDebugging

Pretty neat, especially if you don’t know how a shader works… Or how it isn’t working.

Time Spent:
Adding normals to the build process: 30 Minutes
Fixing problems from the previous assignment: 30 Minutes
Draw with lighting: 1 Hour
Fixing the index-order: 1 Hour
Adding in controls to adjust lights: 1 Hour
Writeup: 30 Minutes

Total: 4.5 Hours

Engineering Two Assignment 9

Assignment 9 Download!

Meshes you can overwrite:
coneMesh.lua
sphereMesh.lua
-Both of which are found in the Assets folder.
The above is just to make it easier for grading to find mesh files to overwrite when testing my exporter and build process.

So this week we added a plugin for Maya to export a mesh to a lua file for our project to read in. My exporter takes the mesh and makes it look like all my previous mesh files. These are all in lua, and it saves a HUGE amount of time if we’re doing anything other than a square. After it gets exported, it gets thrown through the MeshBuilder like before and things go on their merry way. It just makes it easier to make complex shapes by not forcing me to do them by hand and letting Maya export all the vertices and the index buffer.

So we needed to have a multitude of objects and various materials. I realized halfway through this that some things weren’t working the way they should be. It was a simple thing that just took me a little while to track down. In the fragment shader, I had been using "g_colorModifier" to adjust the colors in the materials, and this must have changed when we switched to a new shader recently, which now uses "g_color_perMaterial." After I figured that out, the materials were pretty easy to throw together and attach to different meshes.

MeshesAndMaterials

In the picture above, the cone and the middle sphere share the same material (orange). The middle sphere and right sphere share a mesh, but have different materials (orange and green, respectively). The floor mesh remains unchanged; I figured we didn’t need to complicate something that is only there as a reference. It’s neat, though, because you can use the same material on multiple things or use the same mesh for different objects without needing to duplicate data.

We then added a user settings project that allows the user to adjust values in a text file for screen width, height, and fullscreen. It does error checking and has a fallback in case values aren’t entered correctly. The only time it runs into trouble is with multiple monitors: it can crash in fullscreen if you have multiple monitors up. With a single monitor it’s no problem, or you can just not run it in fullscreen.

Finally, we had to add some small details for draw calls. In PIX, you can see below that each draw call is separated out, so instead of one giant list of commands you can break the capture up by draw call.

Debug readout

In release, these do not show up, because we don’t really want that output in a release build of the game, as can be seen below.

Release readout

Time Spent:
Maya Exporter: 1 Hour
Refactoring materials: 1.5 Hours
User Settings: 1 Hour
Fixing a problem with the .exe resulting from the refactor: 1 Hour
Writeup: 30 Minutes

Total: 5 Hours

Engineering Two Assignment 8

Assignment 8 Download!

So we were supposed to get a texture pipeline working and get textures into the game. I got the first part complete, but ended up running out of time before I could get the textures working. It’s almost there, just fell a bit short.

So once again, the build process wasn’t too bad for getting textures into a binary format, and I’ve got them loading into the game fine too; each material handles its own texture. After getting the build process fixed up, I needed to adjust my mesh file to add UVs, which are just a coordinate system for lining up the texture’s image. The scale goes from 0 to 1, and for cubes and squares it’s pretty easy: the top left corner is 0,0 and the bottom right corner is 1,1. It’s flipped from what you normally see in coordinate planes. After adjusting the mesh file, I needed to change the builder to accommodate the new structure, which means I needed to update the sVertex struct so that DirectX’s buffer knows we’re putting two more floats per vertex into its system.
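For reference, the vertex struct ends up looking something like this after the change (a sketch; my actual member names may differ):

#include <d3d9.h>   // for DWORD / D3DCOLOR

// Sketch: the per-vertex data DirectX now receives. The layout has to match the
// vertex declaration that gets passed to DirectX (shown a bit further down).
struct sVertex
{
    float x, y, z;   // POSITION: 3 floats == 12 bytes
    float u, v;      // TEXCOORD: 2 floats == 8 bytes
    DWORD color;     // COLOR: D3DCOLOR == 4 bytes
};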

I did need to duplicate my material, but that wasn’t too hard. My system was already set up to use multiple materials, so it was really a copy, paste, and change which texture each material used.

So where did everything go wrong? Well, we need to query the fragment shader’s constant table to figure out the register for "g_color_sampler". From there, we need to tell DirectX which register and which texture to use. My register was always 0, so I’m not sure if that’s how it’s supposed to look and I have the UVs messed up, or if I’m missing something else entirely, but it is causing problems there. It’s still drawing everything, so my struct and how I pass my data to DirectX is working. My textures compile, so I know it isn’t that. In PIX, I can see the texture getting set:

PixTexture

But you can see that the textures aren’t actually showing up in the full render:

NoTexture

So since it’s getting the texture and it’s being recognized by DirectX, but still not rendering the texture, I can only assume something is wrong with the UVs. The build process is still moving the UVs over like I expect them to. I changed the declaration for the sVertex to include 2 floats as texture coordinates:
const D3DVERTEXELEMENT9 vertexElements[] =
{
// Stream 0
//---------

// POSITION, 3 floats == 12 bytes
{ 0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
// TEXCOORD, 2 floats == 8 bytes
{ 0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
// COLOR, D3DCOLOR == 4 bytes
{ 0, 20, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR, 0 },

D3DDECL_END()
};

And since color is still showing up, the only thing I can think of is that the declaration for the TEXCOORD might need to be something else. I messed around with the ground and inverted the UVs to make sure I wasn’t doing something wrong, but it still isn’t showing up. I tried getting the table index every frame, and it still turns out to be 0 every time. But since the textures are showing up in PIX, that part must actually be working. So going back to UVs, something must not be… inputting them correctly? I’ll have to look into that more closely.
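For what it’s worth, the sampler hookup I’m describing is roughly this (a sketch of the D3DX calls; "g_color_sampler" is the name from my fragment shader, everything else is illustrative):

#include <d3dx9.h>

// Sketch: find which sampler register the fragment shader assigned to g_color_sampler,
// then bind the texture to that register. A single sampler usually lands in register 0,
// so always getting 0 back isn't necessarily a bug.
void BindTexture(IDirect3DDevice9* i_device, ID3DXConstantTable* i_fragmentShaderConstants,
    IDirect3DTexture9* i_texture)
{
    D3DXHANDLE handle = i_fragmentShaderConstants->GetConstantByName(NULL, "g_color_sampler");
    if (handle != NULL)
    {
        UINT samplerRegister = i_fragmentShaderConstants->GetSamplerIndex(handle);
        i_device->SetTexture(samplerRegister, i_texture);
    }
}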

If I had been a responsible person and actually done my work ahead of time, perhaps I could have figured out where I went wrong. Instead, I spent the weekend watching Blizzcon and shirking my duties. And before that, I just had no motivation after IGF. It isn’t an excuse, just insight into why I didn’t finish.

Time Spent:
Texture builder and game-side loader: 1 Hour
Adjusting all the mesh, vertex updates, etc: 1.5 Hours
Debugging and failure: 1.5 Hours
Writeup: 30 Minutes

Total: 4.5 Hours

Engineering Two Assignment 7

Assignment 7 Download!

So this week, we moved our meshes over to use LUA to make it easier to create human-readable files, for debugging and various other things. So… lucky for me, I already had my mesh moved over to LUA from a previous assignment. Whoo! Saved myself a lot of time this week that would have gone into figuring out reading in LUA. Then we created a system to change the file from LUA to binary. Here’s the before:

return
{
vertexBuffer =
{
{
vertex = { -0.5, -0.5, -0.5 },
color = { 255, 0, 0 },
},

--This goes on for each vertex

{
vertex = { -0.5, 0.5, -0.5 },
color = { 0, 255, 0 }
},
},

indexBuffer =
{
--Front
0, 1, 2,
1, 3, 2,

--Back
7, 6, 5,
5, 4, 7,

--Left
4, 5, 0,
1, 0, 5,

--Right
2, 3, 6,
6, 7, 2,

--Top
1, 5, 6,
6, 3, 1,

--Bottom
2, 7, 0,
0, 7, 4
}
}

Here’s afterwards:
BinaryJunk
I had to take a screenshot, or else it looks like this:
$驴驴驴每每驴?驴每每?驴驴每每??驴炉每每驴驴?炉每每驴??每炉每???炉每每?驴?每炉每
Fancy! So how does it work?

When reading in the initial LUA file, it works the same way as the original read-in did; I just needed to copy it over to a builder. After reading it in, the builder determines how many vertices and indices are in the file. So when writing the binary file, I did

NumberOfVertices NumberOfIndices BIGPACKOFDATAFORVERTICES BIGPACKOFINDICES

It’s pseudocode, however, thinking back, I might change my variable names to the above to make it sound more like He-Man coded it.
On the game side of things, you read in the binary file. There are a couple of ways you can do this. The easiest way, which I chose, is to do four reads: one for the vertex count, one for the index count, one for the big chunk of vertices, and another for the big chunk of indices. Granted, it does feel super dirty. The other option is doing one big read and then just pointing pointers at the appropriate places. For example, you know the vertex count is going to be first and the index count is immediately after it, so you can just shift the pointer over. Then, since you have the vertex count, you can move the pointer down again by the number of vertices times the size of your vertex structure.
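A stripped-down sketch of the two sides, using standard C++ streams (my builder’s actual I/O calls may differ, and sVertex here is just a placeholder layout):

#include <cstdint>
#include <fstream>
#include <vector>

struct sVertex { float x, y, z; uint32_t color; };   // placeholder vertex layout

// Builder side: counts first, then the raw blocks of data.
void WriteBinaryMesh(const char* i_path, const std::vector<sVertex>& i_vertices,
    const std::vector<uint32_t>& i_indices)
{
    std::ofstream out(i_path, std::ios::binary);
    uint32_t vertexCount = static_cast<uint32_t>(i_vertices.size());
    uint32_t indexCount = static_cast<uint32_t>(i_indices.size());
    out.write(reinterpret_cast<const char*>(&vertexCount), sizeof(vertexCount));
    out.write(reinterpret_cast<const char*>(&indexCount), sizeof(indexCount));
    out.write(reinterpret_cast<const char*>(i_vertices.data()), vertexCount * sizeof(sVertex));
    out.write(reinterpret_cast<const char*>(i_indices.data()), indexCount * sizeof(uint32_t));
}

// Game side: the four-read version described above.
void ReadBinaryMesh(const char* i_path, std::vector<sVertex>& o_vertices,
    std::vector<uint32_t>& o_indices)
{
    std::ifstream in(i_path, std::ios::binary);
    uint32_t vertexCount = 0, indexCount = 0;
    in.read(reinterpret_cast<char*>(&vertexCount), sizeof(vertexCount));
    in.read(reinterpret_cast<char*>(&indexCount), sizeof(indexCount));
    o_vertices.resize(vertexCount);
    o_indices.resize(indexCount);
    in.read(reinterpret_cast<char*>(o_vertices.data()), vertexCount * sizeof(sVertex));
    in.read(reinterpret_cast<char*>(o_indices.data()), indexCount * sizeof(uint32_t));
}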

So I ran into a couple of problems. The first was that I just threw the color in as floats immediately after the position. So what’s the problem with that? Well, on the game side of things, it’s only expecting color to use 4 bytes, represented as an integer. So… not three floats. Instead of just throwing a big list of floats into the binary, I had to create the vertex struct on the builder side of things so it knew how to pack it together. So that was problem one. After that, reading it in was fantastic.

The next problem was on the game side of things, reading in the binary file. Well, the reading itself was a piece of cake: it was basically the opposite of the writing process. But two things happened on this side. Firstly, I didn’t realize that reading into allocated memory wasn’t a copy; I assumed it was. I remembered talking about it in class, though, so it didn’t take too long to figure out that I needed a temporary variable for reading in the values and then copying them over. Then I tried to just set the buffer equal to the location of the copied values. That didn’t work either, since DirectX gives you a memory location and you need to copy your junk into that, so it was a bit of a messy process copying everything and then iterating through the values again. I tried to take out the temporary variable, because that seemed like sound reasoning, but it stopped rendering. So that’s that.

Time spent:
Moving Read Over (also setting up debugging for the builder): 30 Minutes
First Pass on Writing: 1 Hour
First Pass on Read: 1 Hour
Debugging and Fixing Both Sides of Read/Write: 1 Hour
Writeup: 30 Minutes

Total: 4 Hours

Engineering Two: Assignment 6

Assignment 6 Download!

So for this version of the game, we changed the shaders to be built while building assets, as opposed to building them at runtime. This helps performance because players don’t need to wait for the game to build the shaders, we only have to build the shaders once after development (for release), and we can also do any error checking beforehand.

So we moved the build process over to the LUA side of things and created a file that holds what assets need to be built, as well as which builder needs to build them. Mine looks like this:

--[[
This file lists every asset to build
]]

return
{
-- Generic Assets
-- (That just get copied as-is rather than built)
-- Handles meshes for the time being
generic =
{
builder = "GenericBuilder.exe",
assets =
{
"cubeMesh.lua",
"floorMesh.lua",
"material.lua",
},
},
-- Shader builders
-- Vertex Shaders
vertex =
{
builder = "VertexShaderBuilder.exe",
assets =
{
"vertexShader.hlsl",
},
},
-- Fragment Shaders
fragment =
{
builder = "FragmentShaderBuilder.exe",
assets =
{
"fragmentShader.hlsl",
},
},
}

So once we load that file into the code that checks what needs to be built, it passes each asset and its path to the appropriate builder. I split up the fragment and vertex shader builders so that I can debug each one separately and create specific debug code if necessary. The code is virtually the same between the two, but should a problem arise, I’ll know whether it’s a problem with one or the other instead of trying to figure out which part of the code is messing up with which assets.
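The heart of each shader builder is just a build-time compile. Most of that code was provided, but it boils down to roughly this (a sketch using the D3DX compiler; the profile strings are examples):

#include <d3dx9.h>
#include <fstream>

// Sketch: compile an HLSL file at build time and write the compiled blob to the target path.
// i_entryPoint would be something like "main"; i_profile would be something like "vs_3_0"
// for the vertex builder or "ps_3_0" for the fragment builder. Debug builds could also pass
// D3DXSHADER_DEBUG | D3DXSHADER_SKIPOPTIMIZATION in the flags.
bool BuildShader(const char* i_sourcePath, const char* i_targetPath,
    const char* i_entryPoint, const char* i_profile)
{
    ID3DXBuffer* compiledShader = NULL;
    ID3DXBuffer* errorMessages = NULL;
    HRESULT result = D3DXCompileShaderFromFile(i_sourcePath, NULL, NULL, i_entryPoint,
        i_profile, 0, &compiledShader, &errorMessages, NULL);
    if (FAILED(result))
    {
        // errorMessages->GetBufferPointer() holds the compiler's complaints for the error window
        if (errorMessages) errorMessages->Release();
        return false;
    }
    std::ofstream out(i_targetPath, std::ios::binary);
    out.write(static_cast<const char*>(compiledShader->GetBufferPointer()),
        compiledShader->GetBufferSize());
    compiledShader->Release();
    if (errorMessages) errorMessages->Release();
    return true;
}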

I chose the above layout because it’s easy to determine which builder to use, as well as the list of assets to build for that builder. The outer table keys like
vertex =
or
fragment =

are not used by the lua reader and are only there for readability. They are perhaps a bit redundant, since you can tell which builder is being used… right below them. But I imagine you might want to break things up further, with something like
cheatCodes =
{
builder = "GenericBuilder.exe",
assets =
{
"GodMode.cheat",
},
}

It still uses generic builder, but maybe in a bigger file, it’d be easier to find certain files. That seems like grasping at straws, but it makes sense to me. Also, who would have a cheat file?

We also added some debugging code to make sure that should an error pop up during build time, it points us in the appropriate direction. If there’s a syntax error within the “AssetsToBeBuilt.lua” file, then double clicking in the error window will bring up the file in question and tell us which line is the offending one. Note: This does not actually check for profanity, so offensive lines will still get compiled.

Also, the builder has added debug code to the compiled shaders if they are built in the debug configuration.

Honestly, this assignment was unexpectedly shorter and easier than the others in the class. I spent ~2-3 hours on it, but I wasn’t very focused while working on it, so it probably worked out to about an hour’s worth of actual work. There was a lot of code given by JP, and I was under the impression we’d be writing more lua code, but the files he gave us kind of just took care of it. The only code I had to change on the engine side was adjusting how the shaders are stored (DWORD* instead of a CompiledShader). I’m not necessarily saying this was bad, just that I didn’t… learn too much on this one. But I also wasn’t nearly as frustrated with it as with other assignments either. :p

Time spent:
2-3 Hours: Adding in the build steps for the shaders
30 Minutes: Writeup

Engineering Two Assignment 5

Assignment 5 Download!
Controls:
Arrow Keys move the box forward, backward, left, and right.
WASD Move the camera forward, backward, left, and right.
Space moves camera up, and X moves camera down.

So in this assignment, we moved the rectangle into the third dimension, making it a cube. Starting off, it was pretty simple: just add the z component anywhere you deal with coordinates. After that, you need to adjust the mesh for the cube to have six sides. I’ve worked on graphics briefly before, and I remember this taking a really long time, so I just hardcoded the values for the cube into the graphics file until I got it working. After getting the easy part out of the way, you have to create the transforms that move the object (and its vertices) onto the screen. This involves creating a camera that holds all the fancy matrices that you can call up every frame to translate objects into the camera’s coordinate system. If you don’t do this, the GPU won’t know what to draw. This is adjusted like the position was in the previous assignment: we grab the vertex shader and adjust its constant values, passing in the matrices with the translations and rotations that put things in the camera’s view. Then everything outside the -1 to 1 boundary just gets ignored.
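The camera’s job basically boils down to building these matrices every frame and handing them to the vertex shader through its constant table. A rough sketch (the constant names are placeholders for whatever the shader actually declares):

#include <d3dx9.h>

// Sketch: build world-to-view and view-to-screen matrices and set them as vertex shader constants.
void SetCameraConstants(IDirect3DDevice9* i_device, ID3DXConstantTable* i_vertexShaderConstants,
    const D3DXVECTOR3& i_cameraPosition, const D3DXVECTOR3& i_lookAtTarget, float i_aspectRatio)
{
    D3DXMATRIX worldToView, viewToScreen;
    D3DXVECTOR3 up(0.0f, 1.0f, 0.0f);
    D3DXMatrixLookAtLH(&worldToView, &i_cameraPosition, &i_lookAtTarget, &up);
    D3DXMatrixPerspectiveFovLH(&viewToScreen, D3DXToRadian(60.0f), i_aspectRatio, 0.1f, 100.0f);

    // Hypothetical constant names; whatever the vertex shader declares is what gets looked up.
    i_vertexShaderConstants->SetMatrix(i_device,
        i_vertexShaderConstants->GetConstantByName(NULL, "g_transform_worldToView"), &worldToView);
    i_vertexShaderConstants->SetMatrix(i_device,
        i_vertexShaderConstants->GetConstantByName(NULL, "g_transform_viewToScreen"), &viewToScreen);
}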

The changes we made make it possible for objects bigger than 1 unit to fit on the screen. Before, we were just drawing straight to the screen without taking into account what the world looks like. This also allows us to make our square/cube actually look like a cube, rather than being distorted by the window resolution. I did run into a few issues, because when I was updating the cube’s position I was feeding the position into the worldToScreen matrix. It was a copy-paste error, and I ran around in circles for a bit trying to figure out why my object wasn’t drawing :P

So now that we have a world we can put things in, we added a floor for the cube to sit on. I had to refactor my code, because as I mentioned above, my vertex and index buffers were just… sitting in the graphics.cpp file. Which is bad. I didn’t want that to spiral any further than it already had. I fixed up my Mesh class, and thanks to Sid G asking whether we should have more than one index/vertex buffer, it clicked how I could solve that problem. Any renderable has a material and a mesh. I also have a MaterialManager and a MeshManager, so we can keep track of all the materials and meshes we’ve loaded and just request a pointer to whichever one we need for a renderable object. The majority of my time on this assignment was actually spent moving my mesh system over to reading lua files. SUPER cool. I actually had a lot of fun doing this section of the assignment, although I did get stuck a lot. I imagine this will help when we move over to using Maya files, and it will probably save me time in the future. The mesh file has the vertices and colors in one table, and the index buffer in another. I… hope that I don’t need to fix it too much. I can see the index buffer part being particularly vulnerable to mistakes, but I guess doing graphics by hand will always be prone to that stuff.

After that, I broke the controls out into separate components, so that some things can move (such as the cube) and other things stay put (like the floor). This wasn’t that big of a change: I just made a parent control class, made the cube controller a child of it, and had that update the position. I could have done the same thing with the camera, but I’d need to refactor a bigger chunk of my code than I’m willing to right now.

Here’s the starting view, with the camera sitting at ground level:
RectangleStart
You can see the rectangle is sitting in the middle of the floor, and we’re viewing it head on.

Here’s after we move the box and the camera:
RectangleEnd
We moved the box to the corner of the floor and then raised the camera up and moved it to the left a bit. You can see more sides of the box now.

Time spent:
Changing rectangle to cube and adding camera: 3 Hours
Moving mesh system to lua: 3 Hours
Adding floor, fixing control scheme: 30 Minutes
Writeup: 30 Minutes
Total: 7 Hours

Engineering Two Assignment 4

Assignment 4 Download!
To move, use the arrow keys for great success!

So in this assignment, we changed how color works, and it’s no longer hard-coded in the C++ side of stuff, but is determined by the material file. This was added to my material file:
constants =
{
g_colorModifier = { 1.0, 0.5, 0.0, 1.0 }
}
OrangeRectangle

When reading the material file, it looks for g_colorModifier, and if it exists, the material renders with that color (orange, as shown above). Otherwise, it takes the defaults and renders as grayscale.
If we change it to this:
constants =
{
g_colorModifier = { 0.5, 1.0, 0.0, 1.0 }
}
Then you get this:
GreenRectangle

Here, green is rendered. No change to the C++ code and no recompile is necessary. You do need to rebuild assets, which moves the material into the necessary build location. Luckily, with the changes I made to the material system last time, I didn’t need to change much for this. I just created a function that gets called once g_colorModifier is read in; it tells the material to grab the ConstantTable, the DirectX object that holds all the shader’s constants.

The same function that changes g_colorModifier in the pixel shader could be used to adjust the location of the rectangle in the vertex shader. The only difference is that g_colorModifier uses a Vector4 (which is rgba), while the position takes a float array. It makes the exact same calls to the ConstantTable, except it gets the constant from the vertex shader instead of the pixel shader.
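In code, the two cases really are nearly identical. A sketch (the color constant name matches my material; the position constant name is just illustrative):

#include <d3dx9.h>

// Sketch: setting the fragment shader's color constant vs. the vertex shader's position constant.
void SetShaderConstants(IDirect3DDevice9* i_device, ID3DXConstantTable* i_fragmentShaderConstants,
    ID3DXConstantTable* i_vertexShaderConstants, const D3DXVECTOR4& i_colorModifier,
    const float i_position[2])
{
    // Color: a Vector4 (r, g, b, a) set through the fragment shader's table.
    D3DXHANDLE colorHandle = i_fragmentShaderConstants->GetConstantByName(NULL, "g_colorModifier");
    i_fragmentShaderConstants->SetVector(i_device, colorHandle, &i_colorModifier);

    // Position: a plain float array set through the vertex shader's table; same idea, different table.
    D3DXHANDLE positionHandle = i_vertexShaderConstants->GetConstantByName(NULL, "g_position");
    i_vertexShaderConstants->SetFloatArray(i_device, positionHandle, i_position, 2);
}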

We also refactored the build process so that different projects can build different assets. Currently, everything still just gets moved over using the GenericBuilder: if we don’t have a specialized builder, it copies the files to the data directory for the project’s .exe to use. So if nothing changes, it still operates the same way it has for the last couple of assignments, just moving things over. However, if we want to compile the shaders in the future, we could create a ShaderBuilder that takes care of it and converts them to a format that’s easier for the .exe to read at runtime.

Time spent:
Changing the color system to use LUA and constants: 1.5 Hours
Adding in movement and the material constants: 1.5 Hours
Changing the build system: 1.5 Hours
Writeup: 30 Minutes
Total: 5 Hours

Engineering Two Assignment 3

Assignment 3 Download!

So this assignment was to change our previous single-colored triangle into a single-colored rectangle. If we only use a plain triangle list, my original thought was that we’d have to double the number of vertices used. Then there are also triangle strips, which can get complicated even though you only add one more vertex per triangle: the GPU assumes the last two points of the previous triangle are the first two of the next one, which makes things hard to manage without a tool to generate the strip for you. The tactic we used was somewhere in between: an index list along with the vertex list.

RectanglePicture

The vertex list holds all of the points of the rectangle, so in the picture above it holds each corner once. The index list then holds the order for the points: for the first triangle, the indices are 0, 1, 2, and for the second triangle, 1, 3, 2. This is a neat way to do it because you only have to store each vertex once and only lose a little space for the indices, which isn’t bad at all. It’s also worth noting that DirectX makes the edges of the screen go from -1 to 1, so although by coordinates it should be a square, the resolution of the window makes it appear stretched into a rectangle.
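As a concrete sketch, the rectangle’s data is just four vertices and six indices (positions only here, and the corner numbering is just for illustration; the index order matches the 0, 1, 2 / 1, 3, 2 described above):

// Sketch: each corner is stored once, and six indices describe the two triangles.
struct sPosition { float x, y, z; };

const sPosition vertices[4] =
{
    { -1.0f,  1.0f, 0.0f },   // 0: top left
    {  1.0f,  1.0f, 0.0f },   // 1: top right
    { -1.0f, -1.0f, 0.0f },   // 2: bottom left
    {  1.0f, -1.0f, 0.0f },   // 3: bottom right
};

const unsigned int indices[6] =
{
    0, 1, 2,   // first triangle
    1, 3, 2,   // second triangle
};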

PixRectangle

The picture above is just like the last assignment, but this time with more triangles! In the PreVS window, you can see the VTX column, which lists the vertices used: the first three are for the first triangle, and 3-5 are for the second triangle. In the IDX column, you can see the index list I mentioned in the prior paragraph and the point associated with each entry. Nifty.

The next part of the assignment was to switch from hardcoding which shaders to use to a more dynamic, reader-friendly approach. We created materials, which describe how things look: essentially which shaders are used for an object. Using our previous LUA knowledge, we can express a material as a basic LUA table that can be read in. I made a material manager class, which stores all the materials loaded into the game. This way, you can load them all at load time and then just grab the ones you need when you need them. A material is just a vertex shader and a fragment (pixel) shader. Taking this a step further, when we add meshes that hold the triangle lists needed to draw an object, we could hold the mesh and the material in the object and call draw, and it will do all the work of making sure the right material is used on the right mesh. My LUA file looks like this:

return
{
vertexShaderPath = "data/vertexShader.hlsl",
fragmentShaderPath = "data/fragmentShader.hlsl",
}

Fancy!

Time Spent:
Adding index buffer and triangle list: 1 Hour
Switching materials to a class: 1 Hour
Adding in LUA to read materials: 1 Hour
Writeup: 30 Minutes