- Create a TextureBuilder tool to create the texture for your project
- Your cube and your floor plane must render with different textures
Reading Time: 2 hours
Coding Time: 5 hours
Write-Up Time: 1 hour
Total Time: 8 hours
Download Link: http://blogs.eae.utah.edu/jdong/wp-content/uploads/sites/13/2014/11/game3.zip
1. Producing the DDS Texture
The format of the texture is “.dds”, the DirectDraw Surface format that DirectX uses natively. We could use the texture tool provided by DirectX, which you can find in the DirectX Utilities folder; producing a .dds file with that tool is very easy. But here we will do it in code, in the TextureBuilder project.
Similar to the builder projects we used in the previous assignments, the TextureBuilder project takes the source path of the picture (which here is the texture) file and the target path, which is the data folder. I used JP’s texture builder file in my assignment.
One interesting thing about the function D3DXCreateTextureFromFileEx() is its DWORD Filter argument. There are multiple filters, and they are used for resizing. They kick in if you specify a size other than what’s on disk, or if you use the default size, don’t specify that a non-power-of-2 size is okay, and the texture is not a power-of-2 size.
For example, if you load a 48×48 texture, the load function will automatically make it a 64×64 texture, since that is the next acceptable power-of-2 size. If you specify a filter, it will use that filter to resize the image. The texture is then filtered when loaded, and filtered again when rendered, producing quite a blurry image. If you specify a filter of NONE, it will load the image as-is into the upper-left part of the texture and fill the remainder of the texture with black (possibly, and most likely, transparent black if alpha is part of the format). You’ll need to do more work to get accurate texture coordinates, but image quality will be better. Of course, you could just save the image at a power-of-2 size to begin with.
Here is my new AssetToBuild file:
2. Texture Coordinates
In order to map the texture to the object, we need to understand UV coordinates. This system maps the texture’s pixels, which are addressed by integer, to a float range from 0.0f to 1.0f. If a texture’s width is 256 pixels, the first pixel is mapped to 0.0f, and the 256th pixel is mapped to 1.0f.
The vertical direction is “V” and the horizontal direction is “U”.
In order to add the new vertex information, I changed the vertex shader arguments and also the vertex format of the mesh file. Here are the new shader and the new mesh file:
I added a new input argument i_uv and a new output argument o_uv, and in the body of the function I added “o_uv = i_uv;”.
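As a sketch of that change (the i_uv/o_uv names match the description above, but the rest of the signature is simplified and may differ from the actual assignment shader, which also applies the model/view/projection transforms):

```hlsl
// Simplified vertex shader sketch: pass the UV straight through
// so the pixel shader can use it to sample the texture.
void main(
    in const float3 i_position : POSITION,
    in const float2 i_uv : TEXCOORD0,   // new input argument
    out float4 o_position : POSITION,
    out float2 o_uv : TEXCOORD0 )       // new output argument
{
    // (the real shader transforms the position here)
    o_position = float4( i_position, 1.0 );
    o_uv = i_uv;  // the added line: forward the texture coordinates
}
```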
3. Attach the texture to the object
There are several parts I have changed. First is the structure of s_vertexElements; the new format looks like this:
Secondly, when the program reads the binary file, it now reads an extra value per vertex: the UV pair.
Third, set the texture on the Direct3D device.
Here is my game right now:
And here is the screenshot of the Pixel Tool:
And what I want to talk about more is the Texture Filtering.
On MSDN, it is described as:
When Direct3D renders a primitive, it maps the 3D primitive onto a 2D screen. If the primitive has a texture, Direct3D must use that texture to produce a color for each pixel in the primitive’s 2D rendered image. For every pixel in the primitive’s on-screen image, it must obtain a color value from the texture. This process is called texture filtering.
When a texture filter operation is performed, the texture being used is typically also being magnified or minified. In other words, it is being mapped into a primitive image that is larger or smaller than itself. Magnification of a texture can result in many pixels being mapped to one texel. The result can be a chunky appearance. Minification of a texture often means that a single pixel is mapped to many texels. The resulting image can be blurry or aliased. To resolve these problems, some blending of the texel colors must be performed to arrive at a color for the pixel.
Direct3D simplifies the complex process of texture filtering. It provides you with three types of texture filtering – linear filtering, anisotropic filtering, and mipmap filtering. If you select no texture filtering, Direct3D uses a technique called nearest-point sampling.
In order to set the filter of the texture map, we need to use the function IDirect3DDevice9::SetSamplerState():
HRESULT SetSamplerState(
[in] DWORD Sampler,
[in] D3DSAMPLERSTATETYPE Type,
[in] DWORD Value
);
1. Nearest-Point Sampling (cheapest, lowest texture quality)
g_device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
g_device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
2. Linear Texture Filtering (cost and quality between 1 and 3)
g_device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
g_device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
3. Anisotropic Texture Filtering (most expensive, best texture quality)
g_device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_ANISOTROPIC);
g_device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
g_device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 4);
The following screenshots show Nearest-Point Sampling, Linear Texture Filtering, and Anisotropic Texture Filtering (from left to right), and you can see the difference.
Linear Texture Filtering
Anisotropic Texture Filtering
Problems I met
1. Garbled UV problem
After I implemented the texture, the texture on the object looked like a garbled mess. The reason was that the structure of s_vertexElements had not been updated. After I changed its format, the texture displayed normally.