Thursday 16 June 2016

Unit 70 and 71 - OOD - Blueprinting in Unreal 4




Blueprinting is a visual programming method used in Unreal 4 to tie all the separate elements together. Below I have attempted to describe several of the methods I have used to add interactivity to the game I have been developing:


AI
Setting up an enemy with some artificial intelligence that reacts to the player position is a fairly simple job. The following screenshots demonstrate the process with which this is achieved. All of this is set up within an actor blueprint called Enemy AI_1.

AI-2.jpg

When the game begins we set up a simple function node that tells the enemy actor to immediately move to a designated point on the map. The function logic flow is described below:

AI-3.jpg

While the enemy is moving, we need to check one thing - is the player within the enemy's line of sight? If so, the enemy should run towards the player to attack. This is determined by a simple variable called inrange. If the player is in range of the enemy, this variable is set to true; if it is false, the patrol logic above is carried out. This true / false condition is evaluated by what is known as a branch. If true, the enemy will attack the player. If not, the enemy should 'flip flop' between two points, pausing for a second at each before running to the other.
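The branch above could be sketched in Python like this (the function and the patrol point labels are my own invention, standing in for the Blueprint nodes):

```python
def enemy_tick(in_range, current_target, point_a, point_b):
    """One decision step of the enemy AI branch.

    Mirrors the Blueprint: if the player is in range, attack;
    otherwise 'flip flop' between the two patrol points.
    """
    if in_range:                     # Branch node: condition is true
        return ("attack_player", current_target)
    # Branch is false: swap the patrol destination (the flip flop)
    next_point = point_b if current_target == point_a else point_a
    return ("move_to", next_point)

# Player out of range: enemy heading for A flips to B
print(enemy_tick(False, "A", "A", "B"))
# Player in range: enemy attacks instead
print(enemy_tick(True, "A", "A", "B"))
```

The pause at each point would be a Delay node in the real Blueprint; it is omitted here for brevity.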

AI-4.jpg

When our enemy collides with the player, we want to subtract some percentage from the player's overall health. With every collision we compare the value of the health variable to see exactly what state the player is in. When the player's health bar falls to zero, we subtract 1 of the player's 3 lives and recharge the health bar to maximum for the next life. When all 3 lives are lost, we want the game to quit. We can also flash up a message on screen telling the player that they have sustained damage but are still alive.
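The health and lives logic could be sketched like this (a Python stand-in for the Blueprint comparison chain; the damage amount and max health of 100 are illustrative values, not taken from the game):

```python
def apply_hit(health, lives, damage, max_health=100):
    """One collision with the enemy: damage, then check lives.

    Mirrors the Blueprint: at zero health, lose a life and recharge
    the bar; at zero lives, quit the game.
    """
    health -= damage
    if health > 0:
        return health, lives, "damaged_but_alive"
    lives -= 1
    if lives <= 0:
        return 0, 0, "quit_game"
    return max_health, lives, "lost_a_life"

print(apply_hit(100, 3, 30))  # player damaged but alive
print(apply_hit(20, 3, 30))   # health exhausted: lose a life, recharge bar
```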
AI-5.jpg

The player fireball should connect with the enemy and cause a collision to trigger, subtracting health from the enemy. This is similar to the player's health system described above, with a few tweaks. With every hit, the fireball deducts 25% damage from the overall enemy health. Every time this happens, the enemy health variable is set to a new value. If the enemy health variable has not yet reached zero, this process repeats until it does.

When the enemy health reaches zero, we want to swap the enemy actor for an explosion emitter, then destroy the instance. This is done by comparing the current value of the health variable to zero with every hit. When at last the comparison is true, the explosion emitter will trigger, signifying the death of the enemy.

The explosion emitter makes sure it is appearing in the right place by getting the exact coordinates of where the enemy actor last was on the map. This is achieved through the ‘GetActorLocation’ and ‘GetActorRotation’ nodes.
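A minimal sketch of the hit-until-zero loop (Python standing in for the Blueprint; 25 damage on a 100-point health pool matches the 25% per hit described above):

```python
def fireball_hit(enemy_health, damage=25):
    """One fireball collision with the enemy.

    Deducts damage and compares the new value to zero - when the
    comparison is true, the Blueprint spawns the explosion emitter
    (at GetActorLocation / GetActorRotation) and destroys the actor.
    """
    enemy_health = max(0, enemy_health - damage)
    if enemy_health == 0:
        return enemy_health, "spawn_explosion_and_destroy"
    return enemy_health, "still_alive"

# Four hits at 25% each kill a full-health enemy
health, state = 100, "still_alive"
while state == "still_alive":
    health, state = fireball_hit(health)
print(health, state)
```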

AI-6.jpg

The player's second weapon, the grenade, will kill the enemy on impact. The process works exactly the same as the fireball above, but causes far more damage per collision, resulting in a quicker death for the enemy.


AI-7.jpg


Both of the fireball and grenade collisions need their own collision boxes to register what has hit the enemy actor. It is not possible to have one collision box for two different types of player weapon effects.

This is the viewport, which allows us to see visually what elements currently make up the enemy actor. You can see both types of collision box in the component list on the left hand side of the screenshot. They are labelled:
  • BoxCollision_for_fireball
  • BoxCollision_for_grenade

Also in the menu on the left, under the 'My Blueprint' tab, you can see all the variables that the Blueprint intends to reference, including whether the player is in range (a simple true or false) and the current value of the enemy's health, as described above.
HUD
Setting up the heads up display to record the state of the player's health, magic meter, grenades and lives remaining is a reasonably simple process. It begins by making a simple widget blueprint and naming it 'GameMode'. It is placed inside a folder called 'UMG' (short for Unreal Motion Graphics). Our first step is to open the newly created widget in the designer view.



HUD-1.jpg

From here, we create a series of name labels and progress bars, positioning them inside horizontal and vertical boxes. The white box on the grid above represents our computer monitor screen, or overall play area. Inside our third person character blueprint, we should also have set up a number of float variables to represent health, magic, grenades and lives.
HUD-2.jpg

We bind the variables to the progress bars, and for each one we bind we are given the option to build a blueprint. The above example is for the lives bar. From the purple node in the top left, we are first telling Unreal to bind the lives progress bar to the actions and state of the third person character (TPC). The TPC blueprint contains the variables we are working with, so by stating the 'Get Player Character' node, we pull in all the information and variables associated with it. Because we did this, we can then target the current value of the 'Lives' variable and return its value to the progress bar. The progress bar is smart enough to know that a variable value of, say, 0.75 will equate to 75% of the bar being filled.
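That value-to-fill mapping is just a fraction clamped to the 0.0-1.0 range the widget expects, something like this sketch (the function name is mine, not part of UMG):

```python
def bar_fill(value, max_value):
    """Fill fraction a progress bar expects: 0.0 (empty) to 1.0 (full)."""
    return max(0.0, min(value / max_value, 1.0))

print(bar_fill(75, 100))   # 0.75 -> bar is three quarters full
print(bar_fill(150, 100))  # overfull values clamp to 1.0
```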




Third Person Character
One of the trickier and more demanding blueprints to complete is the one concerning the TPC. The information it carries is in a constant state of flux, from its various states through to its current position in the world. Beginning with the task of assigning the key F to shoot fireballs, we go back to the TPC blueprint and begin to create conditional checks and functionality:

TPC-1.jpg
On the far left, in red, we can see a node for the event concerning a press of F on the keyboard. I have already set up a variable called 'Magic True False' to see if the player's magic meter is full or empty: if it has magic remaining the value is 1, for true; if not, it is 0, for false. This is the first comparison check that is performed. If equal to 1, the logic flow can proceed. The next step involves setting the magic variable with a new value - after all, the player has just pressed the F key to fire, so some of the 'Magic' variable should be deducted. This is reflected in the instantly updated progress bar in the HUD (see above). The next step checks to see if the magic meter is now empty. If it is, the 'Magic True False' variable is set to 0 and a string carrying an appropriate message is printed on screen. This means that if the player were to press F again, the whole sequence would attempt to repeat, but nothing would happen as the logic flow would not get past the initial comparison.

If after shooting the magic meter is still full however, the last comparison is ignored and the sequence can be repeated to shoot another projectile.
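As a rough sketch of this gating logic (Python standing in for the Blueprint nodes; the 0.25 magic cost per shot is a made-up value for illustration):

```python
def press_fire_key(magic_true_false, magic, cost=0.25):
    """F-key logic: the 'Magic True False' gate, then deduct magic.

    Returns the new gate value, the new magic level, and what happened.
    """
    if magic_true_false == 0:
        return 0, magic, "print: no magic left"
    magic = round(max(0.0, magic - cost), 2)  # HUD bar updates from this
    if magic == 0.0:
        return 0, magic, "fire; print: magic empty"  # gate closes
    return 1, magic, "fire"

print(press_fire_key(1, 1.0))    # plenty of magic: fire
print(press_fire_key(1, 0.25))   # last shot: fire, then gate closes
print(press_fire_key(0, 0.0))    # gate closed: nothing happens
```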
TPC-2.jpg
Other weapons such as the grenade are processed in exactly the same way. Logic flows can be reused by referencing different variables.

TPC-3.jpg
Remember the HUD widget ‘GameMode’ from earlier? This is the little bit of logic that draws it on the screen in the specified position. The red node on the left is an ‘Event BeginPlay’ which tells the logic to do something as soon as the game starts. In this case the widget is added to the viewport.

TPC-4.jpg

This is where the logic determines how to create a projectile on screen when the F key is pressed. We previously set up the Input action 'Fireball' as a key press of F. The first thing the logic does is check if the 'Magic True False' variable is set to 1 or 0 (full or empty). If it is 1, the logic can proceed. If 0, an appropriate string message is printed on screen. Our fireball actor is a separate actor blueprint that contains the effects and velocity information. If the logic can proceed, the first thing we do is spawn this fireball actor. Its appearance location is determined by the location of the TPC on screen. Much of the above logic concerns which way the actor is facing as well as where they are, so the fireball actor is spawned facing the same direction as the TPC. Due to the setup of speed and velocity within the fireball actor blueprint, it should then fly from this position and travel across the screen.

An interesting element to note here is the reference to the 'Fireball Projectile Offset', the small yellow node on the bottom left of the screen. This offset takes the TPC actor blueprint and allows the fireball actor blueprint to appear anywhere around it via X,Y,Z coordinates as an offset. If we wanted to shoot the fireball from the actor's eyes or hands, this is where we would do it, by altering the offset slightly.
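A simplified sketch of what the offset does (in Unreal the offset is transformed by the actor's full rotation; here I reduce facing to +1/-1 along X, and the offset values are made up - think of the Z component as hand height):

```python
def spawn_location(tpc_position, facing, offset=(50.0, 0.0, 90.0)):
    """Apply a projectile offset to the TPC's world position.

    facing is +1 or -1 along the X axis - a stand-in for the actor's
    rotation, so the fireball always appears in front of the character.
    """
    x, y, z = tpc_position
    ox, oy, oz = offset
    return (x + facing * ox, y + oy, z + oz)

print(spawn_location((0.0, 0.0, 0.0), 1))    # facing right: spawn ahead
print(spawn_location((100.0, 0.0, 0.0), -1)) # facing left: spawn behind origin point
```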
TPC-5.jpg

This is where the logic determines how to create a grenade projectile on screen when the G key is pressed. We previously set up the Input action 'Grenade' as a key press of G. The first thing the logic does is check if the 'Grenades True False' variable is set to 1 or 0 (full or empty). If it is 1, the logic can proceed. If 0, an appropriate string message is printed on screen. Our grenade actor is a separate actor blueprint that contains the effects and velocity information. Unlike the fireball actor, it also includes elements such as friction, gravity and rolling. These will alter how the grenade acts with the environment when it is thrown - it can be heavy or light, so may bounce off obstacles or be stopped on impact with them.

If the logic can proceed, the first thing we do is spawn this grenade actor. Its appearance location is determined by the location of the TPC on screen. Much of the above logic concerns which way the actor is facing as well as where they are, so the grenade actor is spawned facing the same direction as the TPC. Due to the setup of speed and velocity within the grenade actor blueprint, it should then fly from this position and travel across the screen.


Key collecting
A gameplay element I was keen to include involves collecting keys to open locked doors. The player cannot proceed through the game world if they have not worked out how to navigate through these obstacles. The process below describes my thinking for how I put the logic together:


matinee_1.jpg

I made the door first that is surrounded by an invisible trigger box. When the player enters this trigger box, a logic flow is carried out to determine whether the three keys have been collected or not. The keys are also individual actors who carry a true / false variable to say whether they have been picked up or not. These are referenced by the door actor. I decided that if the player had all three then the door would slide down into the ground via an animation.
matinee_2.jpg

A level animation in Unreal is called a matinee. Above is the screen for setting up a matinee - we access it by clicking the menu at the top of the viewport. Following this we create the movement by performing these actions:
1) Select the door in the viewport.
2) In the left hand section of the bottom half of the screen we right click the vertical grey section and choose ‘Add new group’. We name it.
3) With the door still selected in the viewport, we right click this new group and choose ‘Add selected actors’.
4) Under this group we right click and select 'Add movement track'.
5) In the bottom half of the screen we drag out the green timeline to about 5 seconds in.
matinee_3.jpg

6) With our timeline marker at the 5 second mark, we then select our door in the viewport and move it to where we want it to end up. In our case, we just pull the door straight down through the ground it sits on.
7) In the top left of the matinee window is a button called ‘Add key’. Clicking this adds a keyframe. A keyframe means that the computer can determine all the stages of movement that it needs to get an object to end up in a certain place, calculating all the animation frames in between. Our first key frame is a normal door. Our second keyframe of the door puts it below the first, out of sight. When we click play in the matinee timeline we can see the door slide downwards and stop.
8) We make sure that the matinee is saved with a name of our choice. This matinee will be referenced by the level blueprint later on.
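The "calculating all the animation frames in between" that a keyframe enables is essentially linear interpolation. A sketch of the in-between maths, using our two door keyframes (the 5 second mark is from the steps above; the -300 unit drop distance is a made-up figure):

```python
def lerp_keyframes(t, key_a=(0.0, 0.0), key_b=(5.0, -300.0)):
    """Interpolate the door's Z position between two (time, z) keyframes.

    key_a is the door at rest; key_b is the door sunk through the
    ground at the 5 second mark. t is the current matinee time.
    """
    t0, z0 = key_a
    t1, z1 = key_b
    alpha = min(max((t - t0) / (t1 - t0), 0.0), 1.0)  # clamp to [0, 1]
    return z0 + alpha * (z1 - z0)

print(lerp_keyframes(0.0))   # start: door at rest
print(lerp_keyframes(2.5))   # halfway: door half sunk
print(lerp_keyframes(10.0))  # past the end: door stays down
```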

matinee_6.jpg

This is the logic flow that determines whether or not the player has collected all the keys and whether the door will open. The red event node on the left is called 'OnEventBeginOverlap' and refers to the door actor. We need to reference exactly what should be triggering the script upon colliding with the door, which is of course the third person character. We have already set up 3 variables within the player character blueprint that determine whether the player has picked the keys up. These are 'Key 1 True False', 'Key 2 True False', and 'Key 3 True False'. Initially set to false, each variable becomes set to true when the player collects the relevant key actor. The logic above works its way through, comparing each of these variables with 1 - true. If any variable still equals 0, false, the logic for triggering the opening of the door cannot complete. Instead a message prints informing the player they still have to collect that key.
If the logic reaches the end and all key collection variables are set to true, the event dispatcher to trigger the opening of the door is called. The matinee to open the door is inside the level blueprint. All matinees are triggered from here, as they are unique to the level. Level blueprints and normal blueprints have to have special logic to talk to each other, as they are quite different entities. That is where event dispatchers come in - they essentially tell the game code "listen out for this command - when you hear it, carry out the actions specified by the event dispatcher message". We have named our event dispatcher '3LockDoorEventDispatch'.
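The key-checking chain could be sketched like this (Python standing in for the comparison nodes; only the variable and dispatcher names come from the Blueprint):

```python
def try_open_door(key1, key2, key3):
    """Door overlap logic: all three 'Key N True False' flags must be 1.

    Returns the message printed, or the event dispatcher call that
    makes the level blueprint play the door matinee.
    """
    keys = (("key 1", key1), ("key 2", key2), ("key 3", key3))
    for name, collected in keys:
        if collected == 0:            # comparison with 1 failed
            return "print: you still need " + name
    return "call 3LockDoorEventDispatch"

print(try_open_door(1, 0, 1))  # missing a key: door stays shut
print(try_open_door(1, 1, 1))  # all collected: dispatcher fires
```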
matinee_5.jpg

Over in the level blueprint, the code will always be 'listening out' for the event dispatch command. The red node 'Event BeginPlay' tells the logic to start being aware of the event dispatcher the moment the game starts to play. We tell the computer what actions to do by binding them to the phrase '3LockDoorEventDispatch'. After that we target the actor that will trigger the dispatch, which is the door. Following that we tell the computer to finally play the matinee of the sliding door that we created. The animation should then trigger and the door will slide open.

Making an explosion
This is a typical example of how an actor is created. For our grenade actor, we want a specially created explosion that carries its own unique properties. We can set up the blueprint to swap with the grenade actor when the grenade strikes something. The illusion will be of the grenade exploding in a huge burst of flame. Every actor’s creation begins with a small white orb to denote the centre of the object. This is shown below.

EXP_1.jpg


EXP_2.jpg
Creating a component in a blueprint is a simple task. By clicking the 'Add Component' button and selecting from a menu we can add all manner of behaviours and properties to our actor. These can range from tasks such as importing a mesh from Maya or adding rules for velocity and speed through to creating collision rules or particle effects.
In the case of our explosion, there are only a few simple components we need to add. The first is the particle effect 'P_Explosion' that comes with Unreal as standard. Using the X,Y and Z coordinates we can alter the size of the explosion as well as its overall intensity.
Additionally we can specify the blast radius and the force with which it will propel any objects around it outwards - this is known as a ‘force component’, which we shall be setting up to fire in the next step.
We can also set up a collision box component that will help to identify which objects to destroy. Any actors that overlap the explosion collision box can be set to be destroyed upon detonation.
One of the most helpful things about components is that by selecting one and clicking one of the green boxes on the right of the screen, we can quickly assign individual behaviours to just that component and not the whole actor. For example, with our zombies we have set up a collision begin overlap box for both a grenade and a fireball. With two different collision boxes we can set up two completely different kinds of behaviour, even though they both use the same command 'begin overlap'. In the case of our game, we can set the fireball to chip away at an enemy's health or have a grenade take it all and kill the enemy instantly.


EXP_3.jpg
This is a short blueprint that we set up inside the event graph of the blueprint actor. The red node ‘Event BeginPlay’ will trigger whenever the actor is first introduced into the game. If you remember, we specified the grenade actor to swap for an explosion actor on impact, creating the illusion of an explosion. The first thing we want to do in the above blueprint is trigger the effect of an explosion - this is called an ‘impulse’. The impulse needs to refer to the specifics that we set up in the force component, so we set that up as the impulse target. Following this we have a short delay of two seconds so we can play a short burst of the ‘P_Explosion’ animation. After this we destroy the explosion actor, removing it from the game world entirely using the ‘Destroy Actor’ node. If we called ‘Destroy Actor’ before the delay, the explosion would disappear without us seeing any animation. Therefore, the order we call these nodes in is vital to achieving the effect we want to get.
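The ordering argument can be made concrete with a tiny sketch (the step labels are my own descriptions of the nodes, not Blueprint names):

```python
def explosion_sequence():
    """The explosion actor's BeginPlay chain, in the order that works.

    Destroy Actor must come after the delay - calling it first would
    remove the actor before the P_Explosion burst is ever seen.
    """
    return [
        "fire force component impulse",   # push nearby actors outward
        "play P_Explosion burst",
        "delay 2 seconds",                # let the animation play out
        "destroy actor",                  # remove explosion from the world
    ]

for step in explosion_sequence():
    print(step)
```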
EXP_4.jpg


The above is a short logic sequence determining how the grenade should swap for the explosion upon striking something. We use ‘Event Hit’. To place the explosion at the right point in the landscape, we use the ‘Make Transform’ node, which will make the explosion actor appear at the hit location.


Creating a material in Unreal
A material is basically a coloured paint that you can apply to an object in Unreal. We need to tackle Blueprint material editors, which can at first seem confusing. However, understanding what each value means makes life simpler from the outset. So, here is what each refers to in the editor node:


  • Base Colour - Your colour, chosen from a colour picker. RGB value.
  • Metallic - (Set to 0 or 1) - 1 is metallic, 0 is non metallic.
  • Specular - How shiny a surface is - THIS WILL ONLY WORK ON NON METALLIC SURFACES
  • Roughness - (Set from 0.1 to 0.9) - How much light is bounced around from a surface and how cleanly - e.g. imagine how shiny a ball of glass is compared to a ball of chalk.
  • Emissive - How much the material glows
  • Normals - Lumps and bumps, identified by a separate bump map texture.


Colours are termed as floating point colours. To simplify this imagine a pixel. A pixel is made up of:
  • R
  • G
  • B
  • A (Alpha) - transparency, from 0 (fully transparent) to 1.0 (fully opaque)
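A quick sketch of a floating point colour (the helper function is my own; the only real convention here is that each channel lives in the 0.0-1.0 range, and an 8-bit channel value converts by dividing by 255):

```python
def float_colour(r, g, b, a=1.0):
    """Build an (R, G, B, A) floating point colour, clamped to 0.0-1.0."""
    clamp = lambda v: min(max(float(v), 0.0), 1.0)
    return tuple(clamp(c) for c in (r, g, b, a))

print(float_colour(1.0, 0.5, 0.0))          # opaque orange
print(float_colour(128 / 255, 128 / 255, 128 / 255, 0.5))  # half-transparent grey
```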

Some other terms I refer to:
  • Constant - Hold 1 and LMB
  • Constant 3 Vector (Simple RGB Colour) - Hold 3 and LMB
With an understanding of these terms, we can go about making a material colour that we can apply to an object:




In the Content Browser, we create a new folder called 'Materials'




In the materials folder, right click in the Content Browser. Choose 'New Material'. Name it appropriately. Double click the material to open it in the Blueprint editor.




Here's that editor node! All the terms are above in this post, so consult those if you're not sure. For a simple material, we'll only need to use a few of them.




Hold 3 and left click in a space. The Constant 3 Vector field should appear.




Double click it and choose a colour. Drag a connecting wire from the white pin to the Base Colour pin.




Next, the metallic setting. Hold 1 and left click in a space. The constants field should appear.
Set it to 0 or 1 in the value field under Material Expression Constant on the left. Drag off the white pin and connect it to the Metallic pin.




Next, roughness. Hold 1 and left click in a space. The constants field should appear.
Set it to a value between 0.1 and 0.9 in the value field under Material Expression Constant on the left. Drag off the white pin and connect it to the Roughness pin.




Next, Specular. Hold 1 and left click in a space. The constants field should appear.
Set it to 0 or 1 in the value field under Material Expression Constant on the left. Drag off the white pin and connect it to the Specular pin.




Click save, then go back to the Unreal editor viewport. Choose your object, then just drag your newly created material onto it to change its colour.

Wednesday 15 June 2016

Unit 66 - Animation and Modelling Assignment

  BTEC Level 3 Games Design
Unit 66 - P3 Creating a human model in Maya
By Dave Johnson

I have described my process for creating and rigging bipedal 3D models below. The first thing you will need to do is create 2 reference images for the front and side of the head. The two below are ones I created, but feel free to use them. For each, insert a plane polygon, then right click the shape with the mouse. Choose:

Assign new material.
Lambert.
In Common Material Attributes Panel, click the checkered box next to Colour.
Click File.
In File Attributes Panel, click the folder next to Image Name.
Navigate to your image and click OK.
It should appear on the plane.
In Channel Box click Rotate X 90 degrees and / or Rotate Y 90 degrees depending on whether you are setting up your front or side view.
Repeat for the other image.
Line both up with the grid so the grid centre line is down the centre of each image.
Pull both off the grid like this:


Select both by shift clicking them, then in the layers panel, right of screen, click the 'Create a new layer and assign selected objects' icon. It's the fourth one along, furthest right.
Next to the layer name that has been created there will be a blank check box. Keep clicking it until 'R' appears. This means the images are for reference and cannot be selected by accident.


Front View


Side view
1) Begin by clicking the 'Create Polygon Tool' under the 'Mesh' menu and draw around one eye in your front viewport. Extrude out the edges of this and delete the face that is covering the eye. Extrude the two edges nearest to the bridge of the nose and pull them to the centre of your grid. After this go to the Duplicate Special menu (click the little square to the right of the option) in the Edit drop down. You should be presented with something similar to this dialogue box:


Set it up exactly like this, with geometry type instance and x scale -1. Click Duplicate Special. This will create a reverse mirror image copy of what you have just done. The beauty of it is all changes and edits you make on one side will also appear on the other. Perfect for modelling a symmetrical face!
You should end up with something like the image 1), so continue to extrude edges downward to cover the nose. Just block stuff in for now, don't worry too much about detail.



2) Continue blocking in the area above the top lip by extruding edges, just as we have been doing so far. Follow the exact pattern of the mesh in the picture if you can.



3) Continue to extrude the muscles downward and around the mouth, and begin to create the shape of the mouth itself.


4) You should end up with something that looks like the picture above.


5) Continue onto the chin. Again, just block it in for now, don't worry about the detail or curvature of it.


6) Now take the edges of the nose and extrude them out. As much as possible you'll want 4 sided shapes, or quads making up each part of your mesh. 5 sided are ok sometimes and even 3, but don't go above 5 sides. If you right click on your object you'll have the option to select vertices. Do this, and select any two vertices you want to merge together. In the case of the image above, you can see that I have extruded out an edge and now want to make it part of the existing mesh by merging vertices. I would do that like this:


You can see I have two vertices in yellow and two in blue. I would select both yellow and click Edit Mesh > Merge. The same for blue. Once in a while, it helps to select every single vertex and just click Edit Mesh > Merge. This makes sure that each half of the image is stitched together by its corresponding vertices right down the centre. Not doing this will have some weird effects later on, as the computer will think it has two separate objects and will struggle to understand they are supposed to be joined together.


7) You should end up with an image like the one above


8) Now let's get that side view involved! Set up your viewports so you have just the front and side on screen. You can do this by going to the 4 port view and pulling the dividing line that runs across the screen centre upwards.


9) Begin the next part by concentrating on the centre most line of vertices that runs from the bridge of the nose to the chin. Choose the one that begins  at the top centre of the face (bridge of nose) and pull it in the side view so it matches the profile of the drawing.



10) Continue onwards with that centre line of vertices and you should get something like this by the time you get to the chin.

11) A quick word of advice on the chin. At this stage, the chin vertices are the only ones that you can pull up and down to match your profile (see image above). All others, just pull left and right. You may feel restricted by this and think your image isn't working, but trust me, the patience is worth it eventually.



12) Continue onto the next line of vertices out. If you are unsure, look at the coloured chart below; after the central yellow vertices, it shows which group you should work on next. I work inwards to outwards - yellow, then orange, then red (only two of these), then magenta, blue, cyan, green, black, brown.


REMEMBER - WITH EXCEPTION OF THE CHIN, ONLY PULL VERTICES LEFT AND RIGHT.


13) Following the contours and lines of your drawing, you should end up with something like the image above. Notice how the eye socket has begun to shape itself. To get it looking just right, let's drop in a polygon sphere for the eyeball.


14) In front view select the sphere, size it accordingly and hold J whilst rotating it. This is so we can use the polygon pattern in the sphere to give us a good idea of where the pupil would be.



15) In side view move it into place.


16) In perspective view, take the vertices around the eyeball and move them forward or back to follow the curvature of the ball. You want it so that no part of the eyeball can be seen OUTSIDE the face - we want it to sit snugly inside.


17) In front view tweak the same vertices so they look good and cover the eyeball. Now we are going to work on the lips and mouth. Take all the edges around the inside of the mouth and extrude them inward.


18) Pull them around so the bottom lip is bigger than the top lip. This is a trait in women's mouths, which we are obviously aiming to recreate.



19) Overlap the top lip over the bottom ever so slightly. Remember that whatever you do in the front view must also be taken care of in the side view, so adjust it accordingly. Use the screenshots for reference.



20) Now we want to insert another edge loop in the mouth to round it out and add shape. Do this by clicking Edit Mesh > Insert Edge Loop Tool. Insert it where I have inserted mine in the picture above. With the edges or vertices this creates, pull them out to plump up the lips slightly, rounding them out.



21) Now we're going to add shape to the nose, and for that we need a corresponding number of polygons, so add an edge loop under the nose like I have in the picture.



22) Extrude edges out and merge vertices together. At this point, we can also start thinking about the jawline. In side view choose the bottom most polygon edge and extrude it out towards the ear, like the picture in 23).


23) Extrude again at this point and pull up toward the ear.



24) Remember to move things in front and side view simultaneously.


25) Around the chin area, extrude edges and merge vertices to build shape to the face. Also think about the curvature of the chin and the way you arrange your vertices.


26) Keep adjusting and tweaking vertices in side view.

27) Round out the nostrils by selecting each edge around the nostril hole and adding an edge loop. This creates more polys, more vertices and more opportunity to add details to the face.


28) Any edge loop should run the length of the face. Those new ones below the nostril need to continue down the chin, so add one in using the insert edge loop tool. This will also help round out the chin.

29) Adjust as necessary in front or side view.


30) The nose vertices are still quite flat and uniform. Adjust them to create more of a realistic flow around the eye to nose bridge area. Do this by moving the vertices left or right in front view (NOT UP OR DOWN).

31) You've seen how edge loops add detail, so add them in wherever you think may be appropriate, but bear in mind the polygon count and complexity of the mesh.

32) Take the top line of edges across the brow and extrude them upwards to begin to create the forehead.


33) Cover the nostrils by extruding the nostrils edges inwards then pulling them up into the inside of the nose.






Unit 67 - P3 - Animation : Building a skeleton and rigging a model
Using the rigging and IK tools in Maya, we can take a model and set up a realistic set of limb joints that we can manipulate. This will enable us to pose and eventually keyframe animate our model. I have documented the process of building a skeleton within a model below, beginning with us positioning our model in the centre of the viewport.
2.jpg
We begin our process on the left leg. Selecting the animation option in the dropdown menu in the top left, we select the build joint tool on the shelf (seventh icon from left). In the side view we draw 5 connecting joints, from the top of the leg, through the foot,  to the end of the toe. Already at this stage we need to be thinking about where the joints of the model will bend.

3.jpg
We then go to perspective view and shift the joint set across to sit within the leg. We also name each joint:
L_UPR_leg, L_LWR_leg, L_foot, L_toe and L_leg_end. The convention of prefixing each joint name with "L_" is vital - it marks the joint as part of the left leg.
4.jpg
Using our joint tool we can draw a separate joint in the centre, which we name "Hips". By first selecting our left leg joint set and then shift clicking the hip joint, we can press "P" on the keyboard. This makes the leg a child of the parent hip joint - if we move the hips joint, the leg will follow.


5.jpg
Now by selecting our left leg joint set and choosing the "mirror joint" options box, we can search for "L_" and replace it with "R_". When we click mirror, the left leg will be duplicated, flipped and have its joints renamed with R_.
6.jpg
Maya will create a copy of our limb and even recognise it as being a child of the parent hip joint.

10.jpg
Next we follow the exact same process for the arm. Starting with the left arm we make a new joint set called L_UPR_arm, L_LWR_arm, L_Wrist and L_hand. Additionally, we also make a joint around the collarbone area called L_collarbone and make our left arm joint set a child of the L_collarbone. This means that when L_collarbone is moved, the entire left arm will follow accordingly.
What we also want to do here is make a number of spine joints that run the length of the back, go through the neck and end in a couple for the head.
We take the spine joint nearest the hip and make it a child of the hip joint parent. This means that when the hip moves, the entire spine will move.
Additionally we also make the L_collarbone a child of the spine joint that is nearest to it. Once again, this means that when the spine moves, the left collarbone and arm will follow accordingly.
13.jpg
Next, just like the legs, we select the left collarbone and left arm joint set and mirror it, finding and replacing L_ with R_ in the mirrored joint set.

With our skeleton complete, we can now bind it to the mesh by shift-clicking both and selecting Smooth Bind from the animation shelf. Now, wherever the skeleton joints move, our mesh will follow.
 
Next we want to set up our IK handles - IK is short for 'Inverse Kinematics'. An IK handle is a single controller that runs from one selected joint to another, appearing as a straight line between joint A and joint B. When we move either end in or out, the joints in between are pushed accordingly, creating realistic movement such as bending elbows and knees (see picture above). We can also rotate the IK handles to twist our limbs realistically. To set up the IK handles on our limbs we select:

L_foot and L_UPR_leg, then click Create IK Handle on the tool shelf - makes the left leg IK handle
R_foot and R_UPR_leg, then click Create IK Handle on the tool shelf - makes the right leg IK handle
L_hand and L_UPR_arm, then click Create IK Handle on the tool shelf - makes the left arm IK handle
R_hand and R_UPR_arm, then click Create IK Handle on the tool shelf - makes the right arm IK handle
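The maths an IK handle solves behind the scenes can be sketched for the simple two-bone case (one knee or elbow) using the law of cosines. This is a generic analytic two-bone IK solution in plain Python, not Maya's actual solver, and the bone lengths are invented for illustration:

```python
# Conceptual sketch of two-bone IK: given fixed upper and lower bone
# lengths, find the knee/elbow angle that lets the chain reach a target
# at a given distance. Law of cosines, not Maya's internal solver.
import math

def two_bone_bend_angle(upper_len, lower_len, target_dist):
    # Clamp the distance so an out-of-reach target just straightens the limb.
    d = max(abs(upper_len - lower_len), min(target_dist, upper_len + lower_len))
    cos_knee = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_knee))))

# Pulling the foot IK handle closer to the hip bends the knee further.
print(round(two_bone_bend_angle(4.0, 4.0, 8.0)))  # fully extended -> 180
print(round(two_bone_bend_angle(4.0, 4.0, 4.0)))  # bent knee -> 60
```

This is why moving the IK handle "pushes the joints in between": the solver recomputes the in-between angle every time the end effector moves.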

16.jpg
We also need an easier way to access those IK handles. An IK handle on its own is relatively narrow and can be a little difficult to select, so we use the curves tool to draw larger circular control handles and lock them to the IK handles. With a much bigger handle we can select an IK easily, as well as twist and rotate the joints nearest to the control handle. We start with our left foot.

18.jpg
We can then duplicate it to create a right foot controller as well. On all our controllers we need to click Freeze Transformations.

20.jpg
The process of attaching each controller is as follows: we select the relevant curve and IK handle (in the picture above, the right foot controller and the right leg IK handle). With both selected, we apply a point constraint, meaning the controller curve is locked to the chosen handle's position. We then select the controller and the relevant nearby joint and apply an orient constraint, meaning that where the controller rotates, the joint's orientation follows.
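What the two constraints do can be summed up in a small conceptual sketch (plain Python, not Maya code - the dictionaries and function are invented for illustration): a point constraint continuously copies the driver's translation onto the target, and an orient constraint copies its rotation.

```python
# Conceptual sketch of point and orient constraints (not Maya code):
# the constrained object continuously copies translation and/or rotation
# from its driver - here, an IK handle following its controller curve.

def apply_constraints(controller, ik_handle, point=True, orient=True):
    if point:
        ik_handle["translate"] = list(controller["translate"])
    if orient:
        ik_handle["rotate"] = list(controller["rotate"])
    return ik_handle

r_foot_ctrl = {"translate": [2.0, 0.0, 1.0], "rotate": [0.0, 30.0, 0.0]}
r_leg_ik = {"translate": [0.0, 0.0, 0.0], "rotate": [0.0, 0.0, 0.0]}

apply_constraints(r_foot_ctrl, r_leg_ik)
print(r_leg_ik)  # the IK handle now matches the controller's transform
```

In Maya the copy happens continuously every frame rather than once, which is why the rig stays live while animating.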

22.jpg
We then repeat this process to create arm controllers, with the curves locking to the hands and the arm IK handles.
24.jpg
Continuing on, we create a hips controller, but we do not hook it up just yet.

26.jpg
Click the hips joint, then the joint where the arms are attached, and click the IK Spline Handle Tool icon to create a curved spine along the joints we have made (much like the Create IK Handle process). The spline handle tool creates a special kind of handle that contains vertex sub-objects. These sub-objects can be manipulated to make the spine bend and curve like a real one. It will eventually be attached to the spine joints.

We then shift-click hips_CTRL and Spine_IK and apply a point constraint. Next we shift-click hips_CTRL and Spine_IK_curve, open the orient constraint options box, constrain the Z and X axes only, and hit Apply.

28.jpg
The spine IK curve has features such as roll and twist available to it. We now want to take advantage of the roll (forward and back) channel in the Spine_IK_curve. We do this as follows:

Go to Window > General Editors > Connection Editor

Find Hips_CTRL in the outliner
Hit Load Left button in the Connection Editor
Find rotate and expand
Highlight rotateY

29.jpg
Find Spline_IK in the outliner
Hit Load Right button in the Connection Editor
Find roll and highlight

Duplicate Hips_CTRL and rename it Chest_CTRL - snap it to the spine joint in the chest area (the one the collarbones are attached to)

Find Chest_CTRL in the outliner
Hit Load Left button in the Connection Editor
Find rotate and expand
Highlight rotateY

Find Spline_IK in the outliner
Hit Load Right button in the Connection Editor
Find twist and highlight
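The Connection Editor steps above are essentially wiring one attribute to drive another: the controller's rotateY feeds the spline IK's roll (and, for the chest, its twist). A minimal conceptual sketch of that one-way attribute wire in plain Python (not Maya's dependency graph - the `Node` class is invented for illustration):

```python
# Conceptual sketch of a Connection Editor link (not Maya code):
# setting a source attribute pushes its value to every connected
# destination attribute downstream.

class Node:
    def __init__(self, **attrs):
        self.attrs = attrs
        self.outgoing = []  # (source_attr, target_node, target_attr)

    def connect(self, src_attr, target, dst_attr):
        self.outgoing.append((src_attr, target, dst_attr))

    def set(self, attr, value):
        self.attrs[attr] = value
        for src, target, dst in self.outgoing:
            if src == attr:
                target.set(dst, value)  # propagate downstream

hips_ctrl = Node(rotateY=0.0)
spline_ik = Node(roll=0.0)
hips_ctrl.connect("rotateY", spline_ik, "roll")

hips_ctrl.set("rotateY", 25.0)   # rotating the control...
print(spline_ik.attrs["roll"])   # -> 25.0  ...rolls the spine
```

The Load Left / Load Right buttons in the Connection Editor are simply choosing which node is the source and which is the destination of such a wire.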
31.jpg
Make Hips_CTRL parent and Chest_CTRL a child.

34.jpg
Make Left_Arm_CTRL and Right_Arm_CTRL children of Chest_CTRL (This means when the chest moves, the arms will follow).

33.jpg
If we wish, we can make a large main controller by drawing a large arrow on the floor and making all our controllers children of the main parent controller.

Setting up skin weights

Skin weights can be adjusted to prevent the mesh deforming badly when the model bends. The heavier the skin weight, the more rigidly that area of skin follows its joint, and so the less likely it is to deform when bending.
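The underlying idea is linear blend skinning: each vertex moves by a weighted average of what its influencing joints would do to it. Here is a minimal conceptual sketch in plain Python (not Maya code - the function and joint movements are invented for illustration):

```python
# Conceptual sketch of skin weights (linear blend skinning, not Maya code):
# a vertex's displacement is the weighted average of the displacements its
# influencing joints would apply. Weight 1.0 on one joint = fully rigid;
# split weights = malleable skin that stretches between joints.

def skin_vertex(joint_deltas, weights):
    # joint_deltas: per-joint displacement applied to the vertex
    # weights: per-joint skin weights (should sum to 1.0)
    x = sum(w * dx for (dx, _, _), w in zip(joint_deltas, weights))
    y = sum(w * dy for (_, dy, _), w in zip(joint_deltas, weights))
    z = sum(w * dz for (_, _, dz), w in zip(joint_deltas, weights))
    return (x, y, z)

head_move = (0.0, 0.0, 2.0)   # the head joint turns/moves
neck_move = (0.0, 0.0, 0.0)   # the neck joint stays put

# Fully weighted to the head (painted solid white): follows it exactly.
print(skin_vertex([head_move, neck_move], [1.0, 0.0]))  # -> (0.0, 0.0, 2.0)
# Split weights (grey): the vertex lags behind and the skin stretches.
print(skin_vertex([head_move, neck_move], [0.5, 0.5]))  # -> (0.0, 0.0, 1.0)
```

Painting white in the Paint Skin Weights tool is effectively pushing a vertex's weight for one joint towards 1.0, which is why solid-white areas stop distorting.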

36.jpg
We can begin to test our skin for deformation by moving the IK handles to bend the limbs. We can also rotate the controller handles we set up.

37.jpg
For example, we discovered that if we set up a head control curve, attach it to the head joint and rotate it, the head begins to distort. To fix this we go to our animation menu and choose Skin, then open the Paint Skin Weights Tool options box.




38.jpg
The joint list should come up in a menu on the right. By selecting each joint we can see, in the white highlighted areas, just how rigid the skin is on that part of the mesh. If it is more grey than white, the skin is quite malleable; if it is solid white, it is rigid. We can use a paintbrush tool to paint in whiter areas on malleable parts, or make rigid parts more malleable by selecting 'replace' rather than 'add' in the menu.


39.jpg
In the picture above you can see how painting the head white has made the skin more rigid, so it no longer deforms when the head turns.







Concept design work
The following describes in brief what my enemy characters were intended to achieve during the animation stage. Due to time constraints and technical limitations, I had to compromise my ideas, but hopefully the sketches below will give some idea of what I had in mind.
1.jpg
The zombies are humans infected by a small waterborne parasite. Microscopic and able to travel undetected through water, the small squid-like creature quickly grows inside a human host. One of the main features of the infected hosts was to have the growing parasite alter them genetically - they would have jaws that open and split down the middle, allowing the many internal tentacles to make themselves known. The above sketch shows the process of how this would happen, transforming an innocent-looking person into a hideous creature.
2.jpg
Sketches showing small incidental animation ideas throughout the levels of the game.
3.jpg
Fully grown parasites eventually break through and destroy the human head, replacing it with a squid-like mass of tentacles that is deadly in combat. The above sketches demonstrate this in action. Note the large tentacle wrapping around the body, which plays host to a poisoned barb.
4.jpg
The zombies would feel their way through an environment with outstretched arms, as demonstrated in the above sketches.
5.jpg
Sketches demonstrating how the zombies would attack the player.
6.jpg
A sketch of the way a fully grown zombie's tentacled head will unfurl, along with an idea for the movement of a strange pair of protruding mandibles.
7.jpg
Infected humans will walk with a lurching, awkward gait, very much as if they are being controlled like puppets from the internal parasite.
8.jpg
Further sketches of the same lurching, puppet-like walk.
9.jpg
A short storyboard showing the impact explosive barrels will have on the zombies when they explode. Strategically shooting them will allow the player to take out large groups of zombies and add to the action.



Screenshots of my final models - main character and zombies

fchar.jpg

Armoured female - main character

fchar2.jpg

Armoured female - main character






fchar3.jpg

Armoured female - main character







zom1.jpg
Zombie 1




zom1a.jpg
Zombie 1








zom2.jpg
Zombie 2








zom2a.jpg
Zombie 2








zom2b.jpg
Zombie 2








Links to videos showing animation tests and gameplay animation


Gameplay walkthrough videos:

Level 1 & 2
https://www.youtube.com/watch?v=T2tef8uODtc

Level 3


Animation test

https://www.youtube.com/watch?v=FJoL7EfW0nM&feature=youtu.be