Abstract: We present a real-time system for character control based on the classification of locomotive actions in skeletal motion capture data. Our method is both progress-dependent and style-invariant. Two deep neural networks correlate body shape and implicit dynamics with locomotive types and their respective progress. In contrast to related work, our approach requires no setup step and lets the user act in a natural, unconstrained manner. Our method also outperforms related work in scenarios where the actor performs sharp changes of direction and highly stylized motions, while performing at least as well in all other scenarios. Our motivation is to enable character control of non-bipedal characters in virtual production and live immersive experiences, where mannerisms in the actor's performance can be problematic for previous methods.
Presented at the 31st Conference on Graphics, Patterns and Images (SIBGRAPI) – Awarded an Honorable Mention
Summary: Modern motion capture systems can record human motion with high precision, but editing this kind of data is troublesome due to its volume and complexity. In this paper, we present a method for decoupling the aspects of human motion that are strictly related to locomotion and balance from the movements that convey expressiveness and intentionality. We then demonstrate how this decoupling can be used to create variations of the original motion, or to mix different actions together.