Engineering Two Assignment 11

Assignment 11 Download!

Controls:
0-9: Change the atlas. (These still change the lighting too, so don't be surprised when that happens… I'm running out of buttons.)

So this week we added UI elements that sit on top of the rest of the world objects. What's cool about this is that we don't need to do any of the transformations from model to world to view to screen; we know exactly where we want the element on the screen, so we can define it directly in screen space. However, we want to make sure the images stay the size we want, despite the resolution changing. Look below!
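Here's a minimal sketch of how the sizing can work (the names here are hypothetical, not my actual classes): specify the sprite in pixels, then convert to screen space using the current resolution, so the quad always covers the same number of pixels.

```cpp
// Hypothetical sketch: convert a desired position/size in pixels into
// normalized device coordinates (-1..1) using the current resolution.
// The same pixel size maps to a smaller NDC rectangle at a higher
// resolution, so the sprite stays the same size on screen.
struct sScreenRect { float left, top, right, bottom; }; // in NDC

sScreenRect PixelsToScreen(float i_x, float i_y, float i_width, float i_height,
                           float i_resolutionX, float i_resolutionY)
{
    sScreenRect o_rect;
    o_rect.left   = (i_x / i_resolutionX) * 2.0f - 1.0f;
    o_rect.right  = ((i_x + i_width) / i_resolutionX) * 2.0f - 1.0f;
    // Pixel y points down but NDC y points up, so flip
    o_rect.top    = 1.0f - (i_y / i_resolutionY) * 2.0f;
    o_rect.bottom = 1.0f - ((i_y + i_height) / i_resolutionY) * 2.0f;
    return o_rect;
}
```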

600x600

1200x600

Although this looks kind of wonky initially, it's still preferable to a stretched image, or to an artist needing to worry about all the different configurations. Anywhos… the biggest accomplishment this week is getting textures working again! 😀 Or… rather, working in the first place. The problem was that my vertex shader was receiving the UVs but not bothering to pass them along, so they just… didn't arrive at the pixel shader. All I had to do was set the out-texture coords equal to the in-texture coords. Easy peasy.
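In shader terms, the fix is a one-liner. Here's an illustrative sketch (not my exact shader):

```hlsl
// Illustrative vertex shader: the UVs have to be explicitly copied to an
// output register, or the pixel shader never sees them.
void main(
    in const float4 i_position : POSITION,
    in const float2 i_uv : TEXCOORD0,
    out float4 o_position : POSITION,
    out float2 o_uv : TEXCOORD0)
{
    o_position = i_position;
    // The missing line: out-texture coords = in-texture coords
    o_uv = i_uv;
}
```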

Unfortunately, I feel like I duplicated a lot of code between the mesh class and the material class and their managers. A LOT of the code was similar and only needed to be tweaked slightly to handle a UI texture element, but it is what it is.

After getting textures on the screen, I had to make sure the transparency of the texture was respected, which meant turning on alpha blending and disabling the z-buffer, because we want to throw the UI on top of everything no matter what. Then you need to toggle both back the next frame when you re-draw the shapes and other non-UI elements. You can see in the picture below that the alpha and depth settings are appropriately toggled for textures.

PixUI
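For reference, these are roughly the Direct3D 9 render states involved (a sketch, not my exact code; it assumes an initialized IDirect3DDevice9 and omits error checking):

```cpp
#include <d3d9.h>

// Sketch: render states for drawing UI on top of everything
void BeginSpriteRendering(IDirect3DDevice9* i_device)
{
    // Respect the texture's alpha channel
    i_device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    i_device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    i_device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
    // Ignore depth so the UI draws over the world no matter what
    i_device->SetRenderState(D3DRS_ZENABLE, D3DZB_FALSE);
}

void EndSpriteRendering(IDirect3DDevice9* i_device)
{
    // Toggle everything back for the next frame's meshes
    i_device->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
    i_device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
}
```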

After getting that to work, I had to work on a texture atlas. I've worked with a similar concept before, a sprite sheet, in a few previous games. The difference here is that back then I just needed to move the rectangle I was drawing from, without worrying about DirectX's buffer junk. It's really the same concept, though: you move the UVs attached to the vertices to wherever you want them to be. So you have an image with all the image states laid out in order, like

0 1 2 3 4 5 6 7 8 9

And the UV range (for my program) spans the whole image, like we've done with textures before. Then when you press 0-9, instead of going from 0 to 1 in the X direction, the UV covers just a tenth of that, while the Y range stays the same, because we want the entire height of the image. So when you push 4, the left side moves to 0.4 and the right side to 0.5, and the texture only displays what's in that section. Pretty neat.
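The math is simple enough to sketch (a hypothetical helper, assuming ten cells in a horizontal strip):

```cpp
// Hypothetical helper: each of the ten atlas cells is a tenth of the
// image wide and the full image tall.
struct sUvRect { float left, right, top, bottom; };

sUvRect GetAtlasCell(const unsigned int i_index) // 0-9
{
    const float cellWidth = 0.1f;
    sUvRect o_uv;
    o_uv.left   = i_index * cellWidth;   // pressing 4 -> left  = 0.4
    o_uv.right  = o_uv.left + cellWidth; //               right = 0.5
    o_uv.top    = 0.0f; // the full height stays the same
    o_uv.bottom = 1.0f;
    return o_uv;
}
```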

Actually pulling this off was a bit weird to me. You have to lock the DirectX buffer, and once you do, it becomes available to write to, just like when you first filled it. I was worried I would have to rebuild the whole buffer or something; really glad I didn't need to worry about that. It still seems like it could be an inefficient way to do things if you need to do it frequently enough, but… maybe it's not? I honestly don't know. The other option would be to have ten different textures and just choose which one to draw each frame, but that seems like a lot of wasted space. Anyways, here are the pictures of the different atlas images (and a sketch of the lock-and-write after them).

UI-2

UI-5
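Here's roughly what the lock-and-write looks like (a sketch with a hypothetical vertex layout, not my exact code; a buffer created for frequent writes may also need dynamic usage and different lock flags):

```cpp
#include <d3d9.h>

// Hypothetical vertex layout for the sprite quad
struct sVertex
{
    float x, y, z; // position
    float u, v;    // texture coordinates
};

// Sketch: Lock() hands back a pointer into the buffer's memory, you write
// the new UVs in place, then Unlock(). Most error handling omitted.
bool SetSpriteUvs(IDirect3DVertexBuffer9* i_vertexBuffer,
                  const float i_left, const float i_right)
{
    sVertex* vertices = NULL;
    // Offset 0 and size 0 mean "lock the whole buffer"
    if (FAILED(i_vertexBuffer->Lock(0, 0,
        reinterpret_cast<void**>(&vertices), 0)))
    {
        return false;
    }
    // Quad as a triangle strip: vertices 0 and 2 are the left edge,
    // 1 and 3 are the right edge
    vertices[0].u = i_left;
    vertices[2].u = i_left;
    vertices[1].u = i_right;
    vertices[3].u = i_right;
    i_vertexBuffer->Unlock();
    return true;
}
```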

We also did a bunch of small housekeeping things to make the game look better and easier to debug. We turned on linear filtering, which was just a couple of settings you flick on when you initialize your DirectX device. It cleans up the lines and makes the textures look a lot better.
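Concretely, the "couple settings" are Direct3D 9 sampler states, something like this (stage 0 shown; my framework may set them in a slightly different place):

```cpp
#include <d3d9.h>

// Sketch: enable linear filtering on texture stage 0 during device setup
void EnableLinearFiltering(IDirect3DDevice9* i_device)
{
    i_device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    i_device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    i_device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}
```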

We also improved the PIX debugging to have more detailed output. Since we added sprites, we might as well have them sectioned off somewhere else so it's easier to read through and find what you're supposed to be debugging. Look:

PIXspritesep

You can see that all the mesh stuff, including resetting the alpha blending and Z-buffer states, is done in the Mesh section, and all the commands to turn off the depth buffer and re-enable the alpha blending go under the sprite section. And then you can see the individual calls for each sprite, just like we did with meshes.
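The sectioning comes from PIX event markers bracketing the draw calls, something like this (the label and color here are illustrative):

```cpp
#include <d3d9.h>

// Sketch: D3DPERF_BeginEvent/EndEvent bracket a group of draw calls so
// they show up as a named, collapsible section in a PIX capture.
void DrawSpritesWithPixMarkers()
{
    D3DPERF_BeginEvent(D3DCOLOR_XRGB(255, 255, 255), L"Draw Sprites");
    {
        // ... toggle the render states and draw each sprite here ...
    }
    D3DPERF_EndEvent();
}
```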

Problems I’m currently encountering:
In release mode, the UI seems to be affected by some sort of lighting effect. I’m not sure why this is happening. Check below!

ReleaseProblems

So here are the things I've looked at:
In PIX, the color is recorded exactly as it shows up on screen, and it isn't being rewritten afterward.
If I replace all the release assets with the debug assets, the problem persists, so it isn't the build process messing things up.
If you restart the game multiple times, the shading on the atlas textures comes out different each time. That makes me think there's some default setting being left uninitialized that's effectively randomized…
None of my sprite shaders use lighting, and removing lighting does nothing.
Removing either UI element does nothing.
Removing the other objects in the scene does nothing.
I made sure I didn't have any logic inside asserts, because I remember that being a problem for some people.
So that's my problem. If I come at it fresh, I imagine I'll be able to think up why it might be happening.

Time Spent:
Initial sprite class setup: 1.5 Hours
Sprite rendering and fixing textures: 2 Hours
Atlas added: 1 Hour
Attempts at fixing the problem: 1 Hour
Writeup: 30 Minutes

Total: 6 Hours
