Animation Rigging

This section covers rigging our 3D character bodies to the VR player.

Looking into Animation Rigging

One of the first things we need to do is get our character model moving along with our player so that it looks like we are controlling the character. Unity has a package for this called "Animation Rigging".

"Animation Rigging allows the user to create and organize different sets of constraints based on the C# Animation Jobs API to address different requirements related to animation rigging. This includes deform rigs (procedural secondary animation) for such things as character armor, accessories and much more. World interaction rigs (IK, Aim, etc.) for interactive adjustments, targeting, animation compression correction, and so on." https://docs.unity3d.com/Packages/com.unity.animation.rigging@0.2/manual/index.html

It sounds like this is something we can really use for animating our characters.

Implementing the plugin

After installing the plugin, it was time to start using it.

The first thing I did was look for tutorials on how to use Animation Rigging, and I came across a few decent ones.

This helped us out a lot. It explains how we can use some of the package scripts to generate the bones and use constraint targets, which we can use to map the position of the hands and head to our VR rig. One important thing to mention is that the character we are using needs to be rigged, otherwise there will be no bones that can be targeted. This was something that we let Mixamo handle.

A few things we need to look at are the way Animation Rigging is built and what we need from it.

As we can see, there are a few components that we need to take into consideration: an Animator (which we already have on our character model), a Rig Builder (which we will be adding to our character), a Rig (which will be placed on the constraints object I will be creating soon enough), and the constraints themselves.

Setting up Animation Rigging

First of all, we need to add a Bone Renderer to our character to get a list of all the bones from our character rig. Adding this script shows us all of the character's bones in the Unity Editor.

We now just have to link all of them to the Bone Transforms list.

Next we needed to create the constraints we were going to use for the hand and head targets.

In our current character GameObject I created an empty object, called it VR_Constraints, and added three targets: one for the left hand, one for the right hand, and one for the head. I also added another two objects per hand: one for the hand target itself, which is used for hand placement, and one for the elbow target.

Now that we have created these, we also need to add a Rig Builder to our character model and add the constraints to the component in the inspector.

After placing all of the constraint targets in their proper positions, we still needed to add a few more components before it was time to start working on the VR rig animation itself.

The first components we need to add are on the hands. These are both "Two Bone IK Constraint" components, which we use to connect the arms to the hand constraint targets.

The next one is a "Multi-Parent Constraint", which we will be using to connect the head bone to the head constraint.
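We assigned all of these references by hand in the inspector, but the same wiring can be sketched in code to show which fields each constraint expects. This is only an illustration, assuming bone and target references like upperArm or handTarget that are not our exact object names:

```csharp
using UnityEngine;
using UnityEngine.Animations.Rigging;

// Sketch of the inspector setup expressed in code; the bone and
// target fields are hypothetical names, we assigned ours by hand.
public class ConstraintSetupSketch : MonoBehaviour
{
    public TwoBoneIKConstraint rightArmIK;
    public MultiParentConstraint headConstraint;

    public Transform upperArm, forearm, hand;  // bones from the Mixamo rig
    public Transform handTarget, elbowHint;    // targets under VR_Constraints
    public Transform headBone, headTarget;

    void ConfigureRightArm()
    {
        var data = rightArmIK.data;
        data.root = upperArm;     // shoulder end of the IK chain
        data.mid = forearm;       // elbow joint
        data.tip = hand;          // bone that follows the target
        data.target = handTarget; // our hand constraint target
        data.hint = elbowHint;    // our elbow target
        rightArmIK.data = data;
    }

    void ConfigureHead()
    {
        var data = headConstraint.data;
        data.constrainedObject = headBone;
        var sources = new WeightedTransformArray();
        sources.Add(new WeightedTransform(headTarget, 1f));
        data.sourceObjects = sources;
        headConstraint.data = data;
    }
}
```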

Creating our scripts

Now it is finally time to start writing our own script to track the animation of the VR player.

In this script I declare a number of variables that we will be using for tracking. First, I created a class called VRMap.

This is used for setting the position and rotation of each VR target on the constraints we previously created. The isPossessed boolean is used to detect whether or not the body is possessed and therefore able to be controlled by the player.
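A minimal sketch of what such a VRMap class can look like, following the common tutorial pattern; the offset field names are assumptions, not our exact implementation:

```csharp
using UnityEngine;

// Maps one tracked VR device (headset or controller) onto a
// constraint target on the character rig.
[System.Serializable]
public class VRMap
{
    public Transform vrTarget;   // headset or controller transform
    public Transform rigTarget;  // constraint target on the character rig

    // Offsets to compensate for the difference between the tracked
    // device pose and the bone the constraint drives.
    public Vector3 trackingPositionOffset;
    public Vector3 trackingRotationOffset;

    public void Map()
    {
        // Position offset is applied in the device's local space.
        rigTarget.position = vrTarget.TransformPoint(trackingPositionOffset);
        rigTarget.rotation = vrTarget.rotation * Quaternion.Euler(trackingRotationOffset);
    }
}
```

The serializable class makes each mapping show up in the inspector, which is where we later tweak those offsets.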

In SetRigPosition I set the position and rotation of the character and map the VR targets using the VRMap class created earlier.
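A hedged sketch of how such a SetRigPosition can look. The headConstraint and isPossessed names come from the description above; the rest (headBodyOffset, the component name) are assumptions:

```csharp
using UnityEngine;

// Drives the character body and constraint targets from the VR devices.
public class VRRigTracker : MonoBehaviour
{
    public VRMap head;
    public VRMap leftHand;
    public VRMap rightHand;

    public Transform headConstraint;
    public Vector3 headBodyOffset;  // keeps the body under the headset
    public bool isPossessed;

    void SetRigPosition()
    {
        if (!isPossessed) return;

        // Move the character under the headset, and face the body in the
        // headset's forward direction while ignoring pitch and roll.
        transform.position = headConstraint.position + headBodyOffset;
        transform.forward = Vector3.ProjectOnPlane(
            headConstraint.forward, Vector3.up).normalized;

        // Map each VR device onto its constraint target.
        head.Map();
        leftHand.Map();
        rightHand.Map();
    }

    // LateUpdate so tracking runs after the animation system each frame.
    void LateUpdate() => SetRigPosition();
}
```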

Now we still have to connect everything in the inspector and set the values. First we connect the head target and constraint; the tracking position offset is used to tweak the position, and the rotation offset can be used to rotate the head bone. In our case this was way off and needed a fair bit of tweaking.

The same goes for both arms, and we can assign the head constraint field to the head constraint target.

As you can currently see, there are a few small issues. The first is that we still see parts of our head from the inside. By creating a list of SerializeField GameObjects, we can add these objects from the editor and use them to activate and deactivate whenever we inhabit or leave the body we want to possess.

This was easy enough to accomplish as I had already kept track of the objects earlier. By turning off the objects in the head locally the player is not able to see their own mouth/head but other players can still see them.
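The idea above can be sketched as a small helper; the class and method names are hypothetical, and the key point is that the toggle only runs on the local client, so networked players still see the full head:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hides the head meshes for the local player only, so the player
// does not see their own mouth/head from the inside.
public class LocalHeadHider : MonoBehaviour
{
    [SerializeField] private List<GameObject> headObjects;

    // Call with false when the local player possesses this body,
    // and true again when they leave it. Because this is not
    // synchronized over the network, remote players are unaffected.
    public void SetHeadVisible(bool visible)
    {
        foreach (var obj in headObjects)
            obj.SetActive(visible);
    }
}
```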


Afterwards it was time to track our network player and no longer our local player, so I had to work together with Tirso to get the tracking right. I updated the targets to be on the rig of the bodies that we can inhabit. This works as follows: each body consists of a rig, a character model, and a skeleton with constraints.

The VR rig is the VR-controllable player with hands and feet that are used for interactions.

The skeleton is the Mixamo skeleton that has all of the bones the animation plugin targets, as well as the constraints we use for tracking the hands (and also the feet).

And finally, the model is nothing more than the mesh renderers that are used to see the model.

The final product started to look a little something like this!

Feedback and iterations

During the progress showcases I received a lot of feedback. One point was to change the naming conventions of the targets, as they looked too much alike, which made it difficult to tell them apart. Another was to work more towards a high-fidelity look, as we were nearing the end of the project and it was time to show what it would look like in the actual end product and not just a sample/test scenario.

As for the naming conventions of the targets, these were changed to look more like the following, so that it would be easier to differentiate between targets.

Looking back

Looking back now at how I made our animation rigging, it would have been better for me to ask for help sooner. Rigging the model to animate the arms and hands was not the issue here; however, tracking these over the server with networking turned out to be way harder than I had expected, and I should have asked for help from people with more experience (or who had worked on the networking side of things more than I did). I did learn a new skill set here, namely how to rig a model with targets that are used when animating for VR, but if I had asked Tirso for help sooner, for example, I might have had more time to work on other things and help polish the project more towards the end.

In conclusion, don't wait too long to ask for help from others.
