Style Invariant Locomotion Classification for Character Control

publications, research and development

Abstract: We present a real-time system for character control that relies on the classification of locomotive actions in skeletal motion capture data. Our method is both progress-dependent and style-invariant. Two deep neural networks correlate body shape and implicit dynamics to locomotive types and their respective progress. In comparison to related work, our approach requires no setup step and allows the user to act in a natural, unconstrained manner. Our method also outperforms related work in scenarios where the actor performs sharp changes in direction and highly stylized motions, while maintaining at least comparable performance in other scenarios. Our motivation is to enable character control of non-bipedal characters in virtual production and live immersive experiences, where mannerisms in the actor’s performance may be an issue for previous methods.
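As a rough illustration of the idea in the abstract — two networks mapping a window of skeletal data to a locomotion type and to a progress (phase) value — here is a minimal Python sketch. All names, sizes, and labels below are hypothetical placeholders, and the weights are random rather than trained; this is not the paper's architecture, just the shape of the computation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_JOINTS = 21    # hypothetical skeleton size
WINDOW = 30      # frames per input window
N_CLASSES = 4    # illustrative locomotion types, e.g. walk/run/jump/idle

def mlp(x, weights):
    """Evaluate a small feed-forward net: tanh hidden layers, linear output."""
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

def make_weights(sizes):
    # Random stand-ins for trained parameters.
    return [(rng.standard_normal((a, b)) * 0.1, np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

d_in = WINDOW * N_JOINTS * 3                 # flattened xyz joint positions
class_net = make_weights([d_in, 64, N_CLASSES])
progress_net = make_weights([d_in, 64, 1])

window = rng.standard_normal((1, d_in))      # stand-in for real mocap data

logits = mlp(window, class_net)
probs = np.exp(logits) / np.exp(logits).sum()            # softmax over types
phase = 1.0 / (1.0 + np.exp(-mlp(window, progress_net))) # progress in (0, 1)
```

The two outputs together let a controller pick which animation to play and where in its cycle to play it, each frame.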

Full paper:

Neural Networks and Animation

research and development

I’ve recently published, as an open-source project, the code for feNeuralNet. This is a Fabric Engine extension for running previously trained neural networks. Fabric Engine is a platform for CG development that works standalone but is also integrated into many animation packages, such as 3ds Max, Maya, Softimage, and Modo. Yes, that means you can now play with neural nets in any of those packages.
If you are wondering what good neural networks are in an animation package, so am I! On a more serious note, machine learning is enabling people in many fields to come up with computational solutions for problems that were previously very hard to solve. I am curious to see what real applications can emerge in animation and vfx pipelines in the next few years.
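feNeuralNet itself is written for Fabric Engine, but the core of "running a previously trained network" is simple enough to sketch in a few lines of Python: push inputs through fixed, already-learned weights, with no training involved. The weights below are hand-picked to implement XOR, purely as an illustration of inference with frozen parameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-picked (already "trained") weights implementing XOR:
# hidden unit 0 fires for OR(x1, x2), hidden unit 1 for AND(x1, x2),
# and the output fires for OR AND NOT AND -- i.e. XOR.
W1 = np.array([[20.0, 20.0], [20.0, 20.0]])
b1 = np.array([-10.0, -30.0])
W2 = np.array([[20.0], [-20.0]])
b2 = np.array([-10.0])

def evaluate(x):
    """Pure inference: push inputs through fixed weights, no learning."""
    h = sigmoid(x @ W1 + b1)
    return sigmoid(h @ W2 + b2)

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
outputs = evaluate(inputs)   # ≈ [0, 1, 1, 0]
```

In a DCC package, the same evaluation would run per frame, driven by whatever scene data you feed the network.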

Check out the product page!

Processing moCap data for Machine Learning with Fabric Engine

research and development, training

In this three-part video series I show the challenges of pre-processing motion capture data for machine learning and how one can go about this task, using Fabric Engine as the development tool. If you find these videos helpful, or if you have further comments or questions on this topic, please leave a reply below.
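To give a flavor of the kind of pre-processing the videos cover, here is a minimal Python sketch of two common steps: making poses root-relative (removing global translation) and cutting the sequence into overlapping windows for training. The joint layout, window size, and stride are illustrative assumptions, not the exact pipeline from the videos.

```python
import numpy as np

def preprocess(positions, window=60, stride=10):
    """positions: (frames, joints, 3) world-space joint positions.
    Returns overlapping windows of root-relative, standardized poses."""
    root = positions[:, 0:1, :]             # assume joint 0 is the hips/root
    local = positions - root                # remove global translation
    frames = local.reshape(len(local), -1)  # flatten joints per frame
    # standardize features so no single joint dominates training
    frames = (frames - frames.mean(0)) / (frames.std(0) + 1e-8)
    starts = range(0, len(frames) - window + 1, stride)
    return np.stack([frames[s:s + window] for s in starts])

rng = np.random.default_rng(1)
clips = preprocess(rng.standard_normal((300, 21, 3)))  # stand-in mocap take
```

Real pipelines usually also remove root rotation about the vertical axis and resample to a fixed frame rate, but the windowing pattern stays the same.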

Video 2:
Video 3:

A thread from the Fabric Engine forum:

Non-Humanoid Creature Performance from Human Acting

publications, research and development

In this work we propose a framework for using human acting as input for the animation of non-humanoid creatures: captured motion is classified using machine learning techniques, and a combination of preexisting clips and motion retargeting is used to synthesize new motions. This should lead to a broader use of motion capture.
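The clip-combination step can be sketched very simply: once the classifier produces a distribution over locomotion labels, preexisting creature clips are blended with those probabilities as weights. Everything below — the clip library, the per-frame curves, the labels — is a hypothetical stand-in, not the paper's actual data.

```python
import numpy as np

CLIPS = {                                 # hypothetical creature clip library
    "walk": np.linspace(0.0, 1.0, 24),    # stand-in for per-frame anim curves
    "run":  np.linspace(0.0, 2.0, 24),
}

def synthesize(probs, labels):
    """Blend preexisting clips weighted by the classifier's confidence.
    probs: classifier output over locomotion labels (sums to 1)."""
    return sum(p * CLIPS[l] for p, l in zip(probs, labels))

frame_curve = synthesize([0.75, 0.25], ["walk", "run"])
```

In the full framework, retargeting then maps the blended motion onto the creature's skeleton; the blend itself is just this weighted sum per animated channel.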

This work was presented as a poster at SIGGRAPH 2016. Click here for the full publication.

Motion Tools

research and development

Motion Tools is a small collection of tools that aims to aid Motion Graphics work created inside Softimage. It does so by providing many ICE compounds and partially abstracting the ICE tree construction process. It eases the creation, procedural animation, and simulation of many objects or chunks of geometry.

Check out this product’s page!

Motion Tools v0.4

research and development

Motion Tools is a small collection of tools that aims to aid Motion Graphics work created inside Softimage. This latest release adds the capability of controlling polygons and polygon islands with particles, enabling one to drive these elements with regular ICE nodes or Motion Tools’ compounds.
Some high-priority bugs and workflow enhancements were also tackled, although, due to time constraints, these improvements fall short of what was initially intended. I hope this still proves useful to some.

Download (right click, or drag into Softimage):
Softimage 2013 – MotionTools/v0.4/2013/MotionTools.xsiaddon
Softimage 2012 – MotionTools/v0.4/2012/MotionTools.xsiaddon

Release Notes: