OUR CHALLENGE
With the capture suit streaming data into UE5, you get the basic functionality: your character's arms and legs are driven by the actor. If the character's measurements are exactly the same as the actor's, the capture results look good. But when the character has long legs or a short body, or any proportions that differ from the actor's, it doesn't work very well. As in the picture below, the character's root floats, the feet slide constantly, and the hands clip through the character's body. That is not what final animation production needs.
Without procedural animation
There are plugins like IKinema, a very professional third-party solution that can solve this retargeting problem; the VTuber Xanadu used it and the results looked very good. But IKinema has shut down, it doesn't work with every brand of capture suit (only Xsens, as far as I know), and there is no version for UE5, which we are developing on. The other plugins have the same problems. So we decided to develop our own retargeting procedure, which is fast and, in theory, compatible with every type of capture, whether an optical or an inertial system. We call it the Procedural Animation System.
With our procedural animation, the result is good
THE PROCEDURAL ANIMATION PIPELINE
The main idea is to create a third, invisible body inside our character to store the original data received from whatever capture software is in use; data from DCC tools works as well. The stored data is then converted to the character's differently proportioned rig. We took a lot of inspiration from the Fortnite project, but our case is quite different. Fortnite shares a pre-made animation from the main rig, and partial rigs can be added on top of it later. We need to convert the pose to a completely different rig, and the animation is real-time and non-repeating, so the "Copy Pose" node mentioned in the video will not work; we have to transform the real-time pose somehow in the character's Blueprint. In this blog, I will explain how we did it.
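As a rough illustration of the data that flows through this pipeline, here is a minimal sketch of the per-frame key points we end up passing from the invisible source rig to the character's Blueprint and on to Control Rig. The struct and field names are hypothetical, chosen for this post rather than taken from the project.

```cpp
#include "CoreMinimal.h"
#include "RetargetKeyPoints.generated.h"  // assumed header name for this illustrative struct

// Hypothetical container for the per-frame key points extracted from the
// invisible source rig before they are converted to the character's proportions.
USTRUCT(BlueprintType)
struct FRetargetKeyPoints
{
    GENERATED_BODY()

    // World-space hip (root) position and orientation on the source rig.
    UPROPERTY(BlueprintReadWrite) FVector  HipLocation = FVector::ZeroVector;
    UPROPERTY(BlueprintReadWrite) FRotator HipRotation = FRotator::ZeroRotator;

    // World-space positions of the two feet on the source rig.
    UPROPERTY(BlueprintReadWrite) FVector  LeftFootLocation  = FVector::ZeroVector;
    UPROPERTY(BlueprintReadWrite) FVector  RightFootLocation = FVector::ZeroVector;
};
```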
1. Creating an invisible rig in the character's UE5 Blueprint to receive the real-time data
In your character's Blueprint, add another skeletal mesh and set it to invisible. It is best if its skeleton has the same measurements as the one in the capture software (we use Rokoko in this case). Then create an Animation Blueprint for this invisible guy and hook it up in the character's Blueprint. You can keep the invisible guy's mesh or use just the skeleton for faster calculation; we keep the mesh because it is easier to debug while running.
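If you prefer to set this up in C++ rather than purely in the Blueprint editor, a minimal sketch could look like the following. The class name AMocapCharacter and the component name "SourceRig" are assumptions made for this example; only standard ACharacter and USkeletalMeshComponent calls are used.

```cpp
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "MocapCharacter.generated.h"  // assumed header name

UCLASS()
class AMocapCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    AMocapCharacter()
    {
        // Second, invisible skeletal mesh that receives the raw capture pose.
        SourceRig = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("SourceRig"));
        SourceRig->SetupAttachment(GetMesh());

        // Hidden by default; toggle visibility back on when you need to
        // debug the incoming pose.
        SourceRig->SetVisibility(false);
    }

    // The invisible rig driven directly by the capture data (Rokoko in our case).
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Mocap")
    USkeletalMeshComponent* SourceRig = nullptr;

    // Converted key-point targets filled in by the proportion step (see below)
    // and read back by the Animation Blueprint / Control Rig.
    UPROPERTY(BlueprintReadOnly, Category = "Mocap") FVector ConvertedHip       = FVector::ZeroVector;
    UPROPERTY(BlueprintReadOnly, Category = "Mocap") FVector ConvertedLeftFoot  = FVector::ZeroVector;
    UPROPERTY(BlueprintReadOnly, Category = "Mocap") FVector ConvertedRightFoot = FVector::ZeroVector;
};
```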
2. Transform the data to the character's proportions
In the screenshot above you will notice that the hip height of the character and the invisible guy are very different, as are the other bones. We generally want to keep the original bones' orientations and only replace the feet and hip positions. But once you change the placement of the feet, their orientation can no longer match the original, so we need an extra IK control: feed in the orientation data first, then adjust the feet positions toward the correct locations. We will talk about the IK part later.
We found that positions are affected far more by the difference in proportions than orientations are, so we simplify the skeleton below the hips. Typically three joints matter most: the root (hips) and the two feet. We extract those three points, recalculate them for the character's proportions, and feed the final data to the Control Rig to do the IK work.
Before that, make sure the converted positions are stored as variables on the character's Blueprint; we will read them back in the Animation Blueprint and the Control Rig Blueprint. You can do the same for the hands if you want them to reach accurate positions.
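As a rough sketch of the proportion conversion itself: the idea is to rescale the hip height and the foot offsets by the ratio between the character's hip height and the source rig's. The helper below is only an illustration under that assumption; the single HipHeightRatio scalar and the choice to rescale only the hips' vertical component are simplifications, not the exact formula we ship.

```cpp
#include "Math/Vector.h"

// Illustrative proportion conversion for the three key points (hips + feet).
// HipHeightRatio is assumed to be CharacterHipHeight / SourceHipHeight.
struct FKeyPointConversion
{
    float HipHeightRatio = 1.0f;

    FVector ConvertHip(const FVector& SourceHip) const
    {
        // Keep the horizontal travel, rescale the vertical component so the
        // character's root sits at its own hip height instead of the actor's.
        return FVector(SourceHip.X, SourceHip.Y, SourceHip.Z * HipHeightRatio);
    }

    FVector ConvertFoot(const FVector& SourceFoot, const FVector& SourceHip,
                        const FVector& ConvertedHip) const
    {
        // Express the foot relative to the hips, scale that offset to the
        // character's leg length, then re-attach it to the converted hip.
        const FVector Offset = SourceFoot - SourceHip;
        return ConvertedHip + Offset * HipHeightRatio;
    }
};
```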
3. Read the parameters back in the Animation Blueprint to prepare for Control Rig
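If this step were written in C++ instead of Blueprint nodes, it might look like the sketch below: the Anim Instance reads the converted key points from the owning character every frame and exposes them as properties for the Control Rig (or the rest of the AnimGraph) to consume. The class and property names continue the hypothetical ones from the earlier sketches.

```cpp
#include "Animation/AnimInstance.h"
#include "MocapCharacter.h"               // assumed header for the hypothetical character class
#include "MocapAnimInstance.generated.h"  // assumed header name

UCLASS()
class UMocapAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Targets copied from the character every frame; these are the values the
    // Control Rig reads to place the hips and feet.
    UPROPERTY(BlueprintReadOnly, Category = "Mocap") FVector HipTarget       = FVector::ZeroVector;
    UPROPERTY(BlueprintReadOnly, Category = "Mocap") FVector LeftFootTarget  = FVector::ZeroVector;
    UPROPERTY(BlueprintReadOnly, Category = "Mocap") FVector RightFootTarget = FVector::ZeroVector;

protected:
    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        // AMocapCharacter is the hypothetical character that stores the
        // converted positions (see the earlier sketch).
        if (const AMocapCharacter* OwnerChar = Cast<AMocapCharacter>(TryGetPawnOwner()))
        {
            HipTarget       = OwnerChar->ConvertedHip;
            LeftFootTarget  = OwnerChar->ConvertedLeftFoot;
            RightFootTarget = OwnerChar->ConvertedRightFoot;
        }
    }
};
```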
Control Rig is the main part of the system: it uses inverse kinematics to fix the leg placement. We also got a surprise: it can not only fix the leg placement but also produce livelier animation, making the result feel much more like a cartoon character, and much more interesting.
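Control Rig itself is authored as a node graph, but as an illustration of the math its two-bone IK does for each leg, here is a standalone sketch: given the thigh and calf lengths and the distance from hip to foot target, it returns how much the knee has to bend, using the law of cosines. This is a simplified stand-in written for this post, not the actual Control Rig graph.

```cpp
#include <algorithm>
#include <cmath>

// Simplified two-bone IK: returns the knee bend angle in radians (0 = fully
// straight) needed for a leg with the given thigh and calf lengths to reach a
// foot target at "ReachDistance" from the hip. Uses the law of cosines.
float SolveKneeAngle(float ThighLen, float CalfLen, float ReachDistance)
{
    // Clamp the reach so the target is never farther than the leg can extend
    // or closer than it can fold.
    const float MaxReach = ThighLen + CalfLen;
    const float MinReach = std::abs(ThighLen - CalfLen);
    const float D = std::clamp(ReachDistance, MinReach, MaxReach);

    // Interior knee angle from the law of cosines.
    const float CosKnee =
        (ThighLen * ThighLen + CalfLen * CalfLen - D * D) /
        (2.0f * ThighLen * CalfLen);
    const float InteriorAngle = std::acos(std::clamp(CosKnee, -1.0f, 1.0f));

    // Pi means a straight leg; a smaller interior angle means more bend.
    return 3.14159265f - InteriorAngle;
}
```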
For that part, check out the next blog, Procedural Animation 2.