
Monthly Archives: October 2014

As we are roaring towards the IGF deadline, fall break fell a week before submission. Since we all knew that we had a lot to do, everyone agreed that we would work through the break. We continued adding more juice, and making what we already had juicier. Another big change was that we overhauled the boss that James had been working on. Earlier, it would simply aim to kill the player. Now we reworked the boss, and our level design philosophy along with it. Instead of killing the player, the boss would now shoot down projectiles which convert the existing ground into slow tiles. This meant that all the levels were reworked to have no slow tiles in them.

As for me, I spent the last two weeks all over the place, helping pretty much everyone on the team with anything engineering related. Whenever I had time on my own machine, I would fix bugs in pretty much anything at this point. One big unforeseen issue was that Joe’s shaders weren’t working in a packaged project. The shaders worked fine in the editor, but when we packaged the game, they simply did not show up. I spent an entire day with him, first going through his shader to see if there was some weird issue there, then going through the post-processing methodology that Unreal uses, and finally changing the shaders to work with the one way I found that would let them run both in the editor and in game. This from a guy who is not at all into graphics programming, and had no idea how Unreal shaders worked.

Other things I accomplished were stopping Brenton from going all sorts of crazy with particles, and helping him with Matinee to get a trailer ready. Also, Unreal released another update, which included a new way to do menus. So we overhauled our existing menu system to use Unreal’s new approach, which looks much better, and allows non-engineers to design menus. With less than a week left for IGF, I know I’m going to be way busier than I’ve been so far in this program.

For this assignment, we had the following two tasks:
1) Use a Lua file to specify which assets to build, rather than passing them in through the command line.
2) Create a ShaderBuilder project, which reads, compiles, and writes out compiled shaders before the game starts. The game then loads in these compiled shaders.

The AssetsToBuild.lua that I used is as follows:

[AssetsToBuild.lua listing]

This file has sections which specify what type of asset is being built (shader or generic, for now). Then, I specify which builder program to use to build them. This is followed by a list of files, what their built (or target) names should be, as well as any arguments needed during the build process. This lets me specify the names that the built files should use, rather than having them keep their original names.
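As a rough illustration of that structure (the key names here are my reconstruction from the description above, not necessarily the ones in my actual file), the Lua file looks something like this:

```lua
-- Hypothetical sketch of an AssetsToBuild.lua with the structure described
-- above: sections per asset type, a builder per section, and per-file
-- source/target names plus optional build arguments.
return
{
	shaders =
	{
		builder = "ShaderBuilder.exe",
		assets =
		{
			{ source = "vertexShader.hlsl", target = "vertexShader.shd", arguments = "vs_3_0" },
			{ source = "fragmentShader.hlsl", target = "fragmentShader.shd", arguments = "ps_3_0" },
		},
	},
	generic =
	{
		builder = "GenericBuilder.exe",
		assets =
		{
			{ source = "mesh.txt", target = "mesh.txt" },
		},
	},
}
```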

The reason for the second part of the assignment, creating a ShaderBuilder, is that it saves loading time at start-up: the game no longer has to compile shaders whenever it starts. It can simply read pre-compiled shaders, reducing the time it takes to get into the game. This also lets us debug the shaders during the build, rather than having to load up the game every single time we make a change to them.
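The runtime side of this is simple: instead of invoking a compiler, the game just reads the compiled shader file into a buffer and hands that buffer to Direct3D. A minimal, Direct3D-free sketch of the loading step (function and file names are mine, not the actual project's):

```cpp
#include <cstdio>
#include <vector>

// Read an already-compiled shader file into memory so it can be handed
// straight to the graphics API (no runtime compilation needed).
// Returns an empty vector on failure.
std::vector<unsigned char> LoadCompiledShader(const char* i_path)
{
    std::vector<unsigned char> buffer;
    if (FILE* file = std::fopen(i_path, "rb"))
    {
        std::fseek(file, 0, SEEK_END);
        const long size = std::ftell(file);
        std::fseek(file, 0, SEEK_SET);
        if (size > 0)
        {
            buffer.resize(static_cast<size_t>(size));
            if (std::fread(buffer.data(), 1, buffer.size(), file) != buffer.size())
            {
                buffer.clear();
            }
        }
        std::fclose(file);
    }
    return buffer;
}
```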

This was a relatively easy assignment, and didn’t take me very long to do. My time estimates are:

Reading/Understanding what to do: ~1 hr
Getting the AssetsToBuild.lua working: ~0.5 hrs
Getting the ShaderBuilder working: ~1 hr
Getting the different file extensions working: ~0.5 hr

Total Time: ~3 hrs

You can download the working exe here.

Your browser may say that the file is malicious. This is because some browsers do not allow sharing executables from unknown sources. However, the file is perfectly safe to run.

After the big alpha last week, followed by the playtest we had, we had a lot of feedback to look at. We had another giant meeting, where we listed everything we heard and what it meant to us. Here is the feedback we received: [feedback]

The big takeaway was that we needed more visual feedback, and a lot more juice. To accomplish this, we started working on a bunch of things. Brenton went head first into awesome Matinees and particle effects. He tried his hand at an opening Matinee for the game, which slowly started evolving into a trailer. We also added glowy effects to our tiles (emissive materials EVERYWHERE). Joe had been working on a shader effect similar to Promethean vision from Halo, or the eagle vision from Assassin’s Creed. He got that working, and we were quickly able to get it in game, which had started looking amazing by now. Another shader he made makes the screen look very pixelated. We decided to use it as part of our penalty system. I was the one who got both of them in game, working like we wanted, but before we could do that, Joe asked me for help in fixing one final bug in his shader. This brings me back to what I talked about last week: people ask me for help even though they know a lot more about this stuff than I do. But, as a result of helping him, I had an idea of how his shader worked, which really helped when I was getting the shaders in game and making them a part of everything else we had. I used Unreal’s dynamic materials to add the shaders to post-processing volumes, and since dynamic materials let me change any exposed parameter in the material, I could also control the amount of pixelisation that happens on screen, easily tying it in with the existing penalty system to make it look awesome.
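The gameplay-to-shader hookup boils down to a mapping: take the current penalty level, turn it into a scalar, and feed that scalar to the dynamic material each frame. The mapping itself is plain math; this sketch (names and ranges are my own illustration, not our actual code) clamps a penalty value in [0, 1] and turns it into a pixel block size, where 1 means no visible pixelisation:

```cpp
#include <algorithm>

// Hypothetical mapping from the game's penalty level (0 = none, 1 = maximum)
// to the pixel block size fed to the pixelation shader as a scalar parameter.
// A block size of 1 means no visible pixelation.
float PenaltyToPixelSize(float i_penalty, float i_maxPixelSize)
{
    const float clamped = std::min(std::max(i_penalty, 0.0f), 1.0f);
    // Linearly interpolate between no pixelation (1) and the maximum block size
    return 1.0f + clamped * (i_maxPixelSize - 1.0f);
}
```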

For this assignment, we had to do the following:
1) Draw a cube and a plane on screen
2) Be able to move the cube left and right, and in and out
3) Be able to move the camera left and right, and up and down

To accomplish these tasks, we had to do a lot of stuff. Since we were drawing two different shapes (meshes), I needed a way to tell the renderer what shape to draw, and where to draw it. For this, I made a Mesh class, whose header is shown below:
[Mesh class header]
Each object of the Mesh class now handles its own vertex and index buffers. The class also holds a reference to the Game.exe window and the Direct3D device being used. Currently the mesh data (vertex info and drawing order) is hard-coded into the class, but in the future it will be made to read that data from a file. Now, I just need to create the mesh objects, initialize them, and then draw them to show them on screen.
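Since the actual header isn't reproduced here, this simplified, Direct3D-free sketch shows the core idea: each mesh owns its vertex data and an index buffer that defines the drawing order (the real class would also hold the window handle and device, which I've left out):

```cpp
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// Simplified sketch of the Mesh idea: each mesh owns its own vertex and
// index buffers. The real class also stores the window handle and the
// Direct3D device, omitted here so the sketch stays self-contained.
struct Vertex
{
    float x, y, z;  // position in model space
    uint32_t color; // packed vertex color
};

class Mesh
{
public:
    Mesh(std::vector<Vertex> i_vertices, std::vector<uint16_t> i_indices)
        : m_vertices(std::move(i_vertices)), m_indices(std::move(i_indices)) {}

    // Every three indices describe one triangle
    std::size_t GetTriangleCount() const { return m_indices.size() / 3; }
    const std::vector<Vertex>& GetVertices() const { return m_vertices; }
    const std::vector<uint16_t>& GetIndices() const { return m_indices; }

private:
    std::vector<Vertex> m_vertices;
    std::vector<uint16_t> m_indices;
};
```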

To achieve a movable camera, I made a Camera class as well. The header for that is:
[Camera class header]
The Camera class holds information about the camera, like the Field of View, the Near and Far planes, and the camera’s position.
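The interesting part of those values is how they feed into the View-to-Screen (projection) matrix. As a rough sketch of the standard perspective math (not our exact code), the scale terms that end up on the matrix diagonal come straight from the field of view, aspect ratio, and near/far planes:

```cpp
#include <cmath>

// Sketch of the perspective projection terms derived from the camera's
// vertical field of view (in radians), aspect ratio, and near/far planes.
// These values sit on the diagonal of the 4x4 View-to-Screen matrix.
struct PerspectiveTerms
{
    float yScale; // 1 / tan(fov / 2)
    float xScale; // yScale / aspectRatio
    float zScale; // far / (far - near), as in a left-handed convention
};

PerspectiveTerms ComputePerspectiveTerms(float i_fovY, float i_aspect,
                                         float i_near, float i_far)
{
    PerspectiveTerms terms;
    terms.yScale = 1.0f / std::tan(i_fovY * 0.5f);
    terms.xScale = terms.yScale / i_aspect;
    terms.zScale = i_far / (i_far - i_near);
    return terms;
}
```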

To draw any object in a 3D environment, we first need to take it from its own space (model space) and find out where it is in the world. This is done through a Model-to-World transformation. Then, once it is in the world, we need to figure out where it is relative to the camera. This is done through a World-to-View transformation. Finally, since all screens are in reality 2D, we need to project the object onto a 2D plane, the screen. This is done using a View-to-Screen transformation. For this assignment, we used perspective projection, as it matches how the human eye sees. These three transformations are represented with 4×4 matrices. The first one, Model-to-World, is based on the object’s position, rotation, and scale. The other two are based on the camera being used. The three matrices are calculated and passed into the vertex shader, which then uses them to decide where on the screen the object is to be drawn, if it’s visible. Here is a screenshot of the cube and plane:
[Screenshot: the cube and plane in the running game]
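For concreteness, here is a tiny sketch (plain C++, no Direct3D) of the first of those transforms: a Model-to-World matrix built from just a translation, applied to a model-space point using the row-vector convention that Direct3D favors. The full version would also bake rotation and scale into the upper-left 3×3:

```cpp
#include <array>

// 4x4 matrix in row-major form, used with row vectors as Direct3D does:
// worldPos = modelPos * modelToWorld
using Mat4 = std::array<std::array<float, 4>, 4>;

struct Vec3 { float x, y, z; };

// Build a translation-only Model-to-World matrix (rotation and scale
// would fill in the upper-left 3x3; here it stays identity).
Mat4 MakeTranslation(const Vec3& i_t)
{
    Mat4 m = {{ {1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {i_t.x, i_t.y, i_t.z, 1} }};
    return m;
}

// Transform a point (implicit w = 1) by a matrix, row-vector convention
Vec3 TransformPoint(const Vec3& i_p, const Mat4& i_m)
{
    return {
        i_p.x * i_m[0][0] + i_p.y * i_m[1][0] + i_p.z * i_m[2][0] + i_m[3][0],
        i_p.x * i_m[0][1] + i_p.y * i_m[1][1] + i_p.z * i_m[2][1] + i_m[3][1],
        i_p.x * i_m[0][2] + i_p.y * i_m[1][2] + i_p.z * i_m[2][2] + i_m[3][2],
    };
}
```

The World-to-View and View-to-Screen matrices are applied the same way, one after the other, which is exactly what the vertex shader does with the three matrices it receives.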
While deciding the drawing order for each face of the cube, we have to keep in mind that the indices follow a left-handed winding order: curl the fingers of your left hand along the vertex order, and the thumb sticks out of the face if we want that triangle to be seen, and into it if we don’t. Hence, all faces of the cube are visible, as the index buffer specifies that all of them are drawn, whereas for the plane, only the top “face” is drawn, and nothing else. If we wanted to see the bottom of the plane, the index buffer would have to specify another set of vertices to draw, but in a different order.
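A quick way to check which way a triangle winds is the z component of the cross product of its two edge vectors: its sign flips when the vertex order is reversed, which is exactly why listing the same three vertices in the opposite order shows the other side of the face. A small, convention-agnostic sketch of that check (which sign counts as "front-facing" depends on the API's culling settings):

```cpp
// The z component of the cross product of a triangle's two edges tells us
// its winding: the sign flips when the vertex order is reversed. Which sign
// counts as "front-facing" depends on the API's culling convention.
struct Vec2 { float x, y; };

float WindingZ(const Vec2& a, const Vec2& b, const Vec2& c)
{
    const float e1x = b.x - a.x, e1y = b.y - a.y; // edge a->b
    const float e2x = c.x - a.x, e2y = c.y - a.y; // edge a->c
    return e1x * e2y - e1y * e2x;                 // z of (e1 x e2)
}
```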

Here is a PIX capture of the program, with the Draw call of the cube selected, and with the shader being debugged:
[PIX capture screenshot]
The shader handles the calculation of the final position on screen, using the matrices that are sent to it. We can see the value of position_world that was calculated in that frame, by using the input position and the Model-to-World matrix.

You can download the exe below, and the controls are:
Arrow keys to move the cube (left and right keys move the cube left and right, and the up and down keys move it in and out of the screen)
WASD keys move the camera (A and D move it left and right, W and S move it up and down). When you move the camera left, the objects appear to move right, and vice-versa. A similar behavior is seen when the camera is moved up and down.

Time breakup:
Reading/Understanding what to do: ~2 hrs
Getting the Mesh class working: ~2 hrs
Getting the Camera class working: ~1 hr
Getting the two objects drawn properly and controls working: ~1 hr

Total Time: ~6 hrs
This took surprisingly little time, though I do know that I didn’t spend time developing a robust architecture.

You can download the working exe here.

Your browser may say that the file is malicious. This is because some browsers do not allow sharing executables from unknown sources. However, the file is perfectly safe to run.

We hit our alpha this week! It may not seem like much (considering it was us who decided that this arbitrary date is our alpha), but what it means for us is that we came up with a brand new design, listed out everything we needed for it, set a deadline, and were able to hit that deadline exactly as we had planned. Along with the big alpha came a big playtest as well, but before we could do that, the build kind of broke on playtest day. Unreal had just released an update, and while it all looked good, it ended up breaking our menu system. And even though I had not touched the menu system until now, I had to dive in and fix it, along with Abhishek.

This, the time I spent with Vinod fixing a weird flickering bug that we’ve had, and all the times I would hear someone on the team call out my name because they were stuck somewhere (Unreal Blueprints make it very easy for anyone to go in and change what happens in the game) made me realize something. A lead engineer isn’t necessarily the best engineer on the team, nor the guy who gets the most done. His responsibility goes far beyond getting his tasks done. His responsibility is making sure that everything works. I’ve learned way more while helping people with bugs they had than when I’m working on a feature by myself. I’ve sat with Abhishek and seen how Unreal manages HUD elements, as well as level loading and general game management. The only way I know how the WaveEmitter (or premonition) works internally is because Vinod asked me to sit down with him to help him figure out a bug. Unreal Matinee, which is something I probably would never have touched, is something I have a rough idea about because Brenton’s been working on it, and occasionally asks for my help, even though I know less about it than he does. Same for the particle system. I know how James’ boss works, because I’ve helped him with some of the math involved.

I have seen every single piece of code in the game, only because people ask me for help because I’m the lead engineer. What they don’t know is that I don’t know more than them; I’m learning from them and with them, and most of the time they find the solutions but still credit me, while I was just there, acting more as a sounding board than instantly telling them the solution to everything. I realize that my role is being the guy who helps everyone with any engineering problem, while also making sure that the different components work together. Also, I’m the guy who gets the weekly builds ready for play (not really happy about the last one, but seeing as we only have three engineers, someone had to take the responsibility). And although I haven’t been doing an amazing job so far, my team trusts me and believes in me, which continuously drives me to become better.

Also, apparently I don’t know how paragraphs work 😛

And lastly: here is some Alpha gameplay footage that Matt recorded: