Engineering Two Assignment 10

Assignment 10 Download!

If the last assignment’s .exe didn’t work, then use this one! It’s bigger and better. And doesn’t crash (I hope :/ )

Controls for lighting:
I: forward
J: left
K: back
L: right
It’s like WASD, but for your right hand.

Directional Changes:
R: increase Red
F: decrease Red
G: increase Green
T: decrease Green
B: increase Blue
V: decrease Blue

Ambient Changes:
1: increase Red
2: decrease Red
3: increase Green
4: decrease Green
5: increase Blue
6: decrease Blue

So we added lighting to our game now. The first step was to take the Maya exporter and make sure that the normals were added to the meshes. Then we changed the mesh-loading code so the normals are read in correctly. All of this has been done before with the UV stuff, so this was pretty much repeating that. We also had to adjust the vertex declaration that we pass to DirectX so it knows how big each vertex is and how to read the buffer.

So why did we need to add normals? The biggest reason is so we know how much light is hitting the surface. If the surface is facing away from the light source, the light should not light it, and if the normal is directly facing the light source, it should be fully illuminated. The easiest way to measure this is the dot product of the normal and the light’s direction, which gives you a number. What do we do with this number? If it’s above zero, then the normal points in the same direction the light is traveling, which means the surface is facing away. If the number is negative, the surface is facing the light source. If the number is zero, it is perpendicular. Flip the sign and clamp at zero (with unit vectors the dot product is already between -1 and 1) and you get a number from 0 to 1, which you can apply directly to the surface for its color. We used this a lot in Mannequin, so it wasn’t too hard of a concept to understand. A mannequin will chase you, but if you look at it, it will stop moving. It’s basically the same thing, except it’s the opposite: the light “chases” you when you’re looking at it. Actually, this analogy is breaking down really fast… :/

So after adding the normals, I got really weird-looking objects. It was either the Maya exporter, or maybe it was just because I didn’t have lights set up yet. I chose to believe it was the second. So we had to do all the fancy calculations I mentioned in the previous paragraph about whether or not to illuminate a surface. In the vertex shader, we really only transformed the normals and then passed them along. Since we passed the normals per vertex, the interpolat..er, interpolator took care of smoothing them across the surface so it wouldn’t look quite so blocky when switching faces. This did not take lights into account, however. It just handled the normals and moved on.

After the interpolation, the data gets passed to the fragment shader, which works the lights in with all the fancy normals business. So then we ask “how lit are you, based on your normals?” We take that lit-ness and multiply it into the color of the light, which gives the amount and the color of light affecting the surface. Then you apply that to the surface’s color and you get a final color. If you have an ambient color, it gets added to the light’s contribution here, before applying it to the surface.

So looking at my spheres, it was looking pretty good. Then I saw my cone and things went Twilight Zone on me: I could see the bottom of the cone through the side of it. I thought maybe the cone might have exported wrong, so I made a torus to see what that did. It also displayed the inside of the donut through the outside. I then suspected my winding order was wrong in the Maya exporter, and sure enough, it was. So in the exporter, I just walked the list backwards when adding to the index list, and that fixed everything. Without lighting, they were just flat shapes and I could never tell anything was going wrong. Finally, my objects did not look wonky.

Using PIX, you can look at an individual pixel, debug it, and find all the draw calls that touched it. This would have helped if the exporter hadn’t been the problem in the paragraph above. It will show you the different times the pixel shader was called and the colors that came out of it. If your DirectX is set up in debug mode, you can step into your fragment shader and find out exactly what is happening; you can actually debug the shader function itself. Look:

PixShaderDebugging

Pretty neat, especially if you don’t know how a shader works… Or how it isn’t working.

Time Spent:
Adding normals to the build process: 30 Minutes
Fixing problems from the previous assignment: 30 Minutes
Draw with lighting: 1 Hour
Fixing the index-order: 1 Hour
Adding in controls to adjust lights: 1 Hour
Writeup: 30 Minutes

Total: 4.5 Hours