First post in 2016

I’m not sure what’s wrong with me that I’m only starting to write this in week 9… but I’ll keep up and bring the total to eight.

For the last few weeks I’ve been working on (1) revising the dialogue manager, (2) eliminating bugs, and (3) implementing new functionality.

Due to carelessness, our existing code had grown dirty and buggy. Some of it is in JS while most of the rest is in C#. The dialogue manager is a key part of our game, where the conversations take place. I ported it to C# and made it cleaner. I also resolved several bugs that people had been complaining about for a while, and implemented new functionality such as the touch-angry reaction and the pick-up animation.

About EAE day

So I was on site at 12 to set up the hardware. From then on, people were drawn in by the tech we had, and I stood by to help them through the playthrough. By 5 pm I was totally exhausted. I also talked with Eric to pin down what to do during intersession to improve our game.

Day before EAE day

I arrived at 1 pm and kept working on UI-related adjustments, because we decided to use VR for EAE day. I also continued getting bugs out of the game. It worries me that people are sometimes too careless about those red errors in Unity: playing in the Unity editor and playing a Unity build behave differently, so we need to get rid of every error that might exist in the game. I worked until 1 am to solve the final bug, while everybody cheered for me. A hard but solid day.

Recent activities

Since the last presentation I’ve focused on polishing the scene, testing, and fixing bugs. I fixed some collider bugs and made sure the player can never fall out of the scene. I also fixed some of the logic for interacting with the suspect to make it feel more natural, fixed the UI, and added a start scene. I’ve been spending some time with our project every week to find things I can work on and fix. We definitely need more assets, and we should probably clean up unused code and assets to make the work more efficient. I’ll keep at this, along with the tasks my teammates assign to me over Slack, email, or Trello.

Before IGF

In the past two weeks, I worked on the following.

  1. Due to the scale change of the scene, I adjusted the distance checks and certain “UI”-like effects that actually live in world coordinates. I had to tune each one manually, because we don’t have an overall wrapper object to contain these relative changes.
  2. The suspect’s animation changed so that he now stands up to talk to you, which meant the old logic for turning the suspect toward the player no longer worked correctly. The new logic detects when the stand-up animation is playing and blends the turn into it, which looks much more realistic: in real life nobody treats turning and standing up as two separate steps.
  3. Added pause and play mechanics to the game, with the corresponding UI.
  4. Rearranged every text position on the screen. While our game is being played, three different pieces of text are shown: the suspect’s replies, the player’s sentence, and the word prompting the player to speak. These should not overlap each other or the UI, so I adjusted their sizes and positions in screen coordinates for a better visual effect.
  5. Added a prompt at the right moment to cue the player to speak to the suspect. An out-of-dialogue prompt was also added, so the player is guided to approach the suspect and start the dialogue. This is an important part of helping the user know what to do, and when.
  6. Changed plenty of minor things in the code for a better overall effect, such as tuning the inter-sentence waiting time.
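The turn-while-standing idea from item 2 is simple to sketch. The real code is Unity C#, but the core is language-agnostic: instead of snapping the suspect’s facing after the stand-up animation ends, drive the yaw by the animation’s progress. This is a minimal Python sketch; the function name and the degree-based yaw representation are my own assumptions, not the project’s API.

```python
def blended_yaw(start_yaw, target_yaw, t, stand_up_duration):
    """Turn the suspect toward the player over the course of the
    stand-up animation, instead of snapping after it finishes.
    Angles are in degrees; t is seconds into the animation."""
    # Clamp progress through the animation to [0, 1].
    progress = max(0.0, min(1.0, t / stand_up_duration))
    # Take the shortest signed angular difference, so the suspect
    # never turns the long way around.
    delta = (target_yaw - start_yaw + 180.0) % 360.0 - 180.0
    return start_yaw + delta * progress

# At the midpoint of a 2-second stand-up, facing 0° turning to 90°:
# blended_yaw(0.0, 90.0, 1.0, 2.0) → 45.0
```

Because both motions share the same clock, the turn always finishes exactly when the stand-up does, which is what makes the combined motion read as one natural movement.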

Hopefully we’re ready for IGF soon 🙂

Making progress

During the past two weeks I’ve been working on basic game mechanics and UI. Here’s what I added.

Our game has a distance check: being neither too near nor too far from the target is good. There are also pop-up buttons indicating whether the current distance is acceptable.
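The check itself is just a band test. Here is a minimal sketch of the idea in Python (the project itself is Unity C#); the function name and the specific limit values in the comment are illustrative assumptions, not the game’s real parameters.

```python
def distance_status(distance, near_limit, far_limit):
    """Classify the player's distance to the target: the sweet
    spot lies between the two limits, neither too near nor too
    far. The returned label drives which button is shown."""
    if distance < near_limit:
        return "too near"
    if distance > far_limit:
        return "too far"
    return "ok"

# With hypothetical limits of 1.5 and 5.0 world units:
# distance_status(3.0, 1.5, 5.0) → "ok"
# distance_status(0.5, 1.5, 5.0) → "too near"
```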

We also have a target box now, with text next to it showing how angry our target is. These need to be adjusted so they always stay next to the target’s head. Also, 3D text in Unity is not occluded by other objects by default, so I added a shader to its material so it can’t be seen through walls, which looks more realistic.

Also, a huge bug from half a year ago is finally solved: the player didn’t move in the direction of input. It was caused by a very well-hidden script being left enabled. It really is necessary for us to keep our code and assets clean.

Busy semester 1st update

The technical blog posts from earlier this semester can be found at:

http://www.engineercharliezhao.com/blog/eae6320-assignment01

http://www.engineercharliezhao.com/blog/eae6320-assignment-02

http://www.engineercharliezhao.com/blog/eae6320-assignment03

These are links to descriptions of the rendering techniques I’ve been studying.

So I redesigned the buggy UI of our project. Our UI components had been in the wrong positions for a while, and they didn’t adjust when you changed the screen size or aspect ratio.

It took me 8 hours to redesign it, including experimenting to find the proper position for each component, writing code to auto-adjust the components, adding new functions to turn the UI on and off, and changing the existing UI animation. It looks good and responds fast now.
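The core of the auto-adjust code is anchoring: express each component’s position as a fraction of the screen rather than in absolute pixels, then recompute pixels whenever the resolution changes. A minimal Python sketch of that idea follows (the real code is Unity C#, and the function name is my own; Unity’s own RectTransform anchors work on the same principle).

```python
def anchored_position(anchor_x, anchor_y, screen_w, screen_h):
    """Convert a normalized anchor (0..1 on each axis) into pixel
    coordinates, so a UI component lands in the same relative spot
    at any screen size or aspect ratio."""
    return (anchor_x * screen_w, anchor_y * screen_h)

# The same anchor adapts to two different resolutions:
# anchored_position(0.5, 0.9, 1920, 1080) → (960.0, 972.0)
# anchored_position(0.5, 0.9, 1280, 720)  → (640.0, 648.0)
```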

I’ll be back on the 17th for the all-hands meeting. I hope to discuss next steps with my groupmates; in my mind it should be adding more character mechanics, like a dog chasing the player.

End of Game Engine Course

So yesterday I submitted the final assignment for the game engine class. I made an Arkanoid-like game with a ball bouncing between paddles, walls, and bricks.

From the start of the semester, I gradually added modules to the engine. It currently has five sub-systems. The world system keeps track of the objects in the game and acts as the game’s interface: the game can fetch specific objects using FindByName helpers, and the system also handles initialization and final cleanup. The collision system comes next. It handles any possible collision between two objects using a collision type and a collision mask, finds all collisions within the frame’s time, processes them, and updates the game depending on the results. That’s where the third system, the physics system, comes in. It has update functions, called by the collision system, that advance the game by a given amount of time; if there’s no collision in a frame, the update function advances the game directly into the next frame. The renderer system stores textures for objects and keeps a list of renderable objects; it’s responsible for putting objects on the screen. Finally, there’s a fifth major system working behind the scenes: the messaging system. It uses function callbacks to notify any responders that have registered for a given message whenever that message is sent.
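The messaging system described above is a publish/subscribe pattern: responders register a callback under a message name, and sending that message invokes every registered callback. Here is a minimal sketch in Python (the engine itself is in C#; the class and method names are illustrative, not the engine’s actual API).

```python
from collections import defaultdict

class MessageSystem:
    """Minimal publish/subscribe messaging: responders register a
    callback for a message name; sending the message invokes every
    registered responder with the sender's arguments."""

    def __init__(self):
        self._responders = defaultdict(list)

    def register(self, message, callback):
        # A responder signs up to be called for this message.
        self._responders[message].append(callback)

    def send(self, message, *args):
        # Invoke every callback registered for this message.
        for callback in self._responders[message]:
            callback(*args)

# Usage: a brick listens for a hypothetical "ball_hit" message.
bus = MessageSystem()
hits = []
bus.register("ball_hit", lambda brick: hits.append(brick))
bus.send("ball_hit", "brick_07")
# hits is now ["brick_07"]
```

The appeal of this design is decoupling: the collision code can announce “ball_hit” without knowing which objects care, and new responders can be added without touching the sender.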

My engine works pretty well, and I hope it will get even better after I take the AI class next semester.

End of Narrative

So the narrative class has come to an end. We polished our gamebook one more time before the final deadline. I tried to fix my basic grammar problems and also added another ending.

I also played and gave feedback on 10 other people’s games. Some of them are really good. I can see that my story needs more direct dialogue; without it, it’s just a plain story with no character. I tried to add that toward the end.

During the semester, I got the chance to see some really good storytelling games, like The Wolf Among Us. Storytelling should also be valued highly in our future project. Our current police-simulation game is a good platform for developing plenty of plots through different branches. I hope we get the chance to add them.

Summary for EAE day

We had a good exhibition on EAE day.

I also see problems we need to solve in the coming period.

It’s worth noting that the facial expression capture is too sensitive, since it responds to every sample in real time. We probably need to add an integrating function that outputs a value representing a period of time, not a single sample point. That would give players a much better chance of keeping the character under control.
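One simple form of that integrating function is a moving average over the last few samples. A minimal Python sketch, assuming expression intensity arrives as one value per frame (the class name and window size are my own illustrative choices, not part of our capture system):

```python
from collections import deque

class ExpressionSmoother:
    """Average the last `window` facial-expression samples, so the
    output reflects a period of time rather than one noisy frame."""

    def __init__(self, window=10):
        self._samples = deque(maxlen=window)

    def add(self, value):
        # Push the newest sample; the deque drops the oldest one
        # automatically once the window is full.
        self._samples.append(value)
        return sum(self._samples) / len(self._samples)

smoother = ExpressionSmoother(window=4)
# A one-frame spike barely moves the smoothed output: the samples
# [0.1, 0.1, 0.9, 0.1] average to 0.3 instead of jumping to 0.9.
```

A larger window gives steadier control at the cost of a slower response, so the window size would need tuning against how quickly players change expression.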

We also need to continue working on the voice analysis and body parameter capture systems. What we showed on EAE day was only the facial expression capture system. Sound analysis is an interesting area I want to work on; I hope we can get started soon.