A couple of weeks ago, Fabric Software abruptly ended the development of Fabric Engine without any follow-up announcement. In this two-post series, I’ll try to go over what Fabric Engine was, the different positionings it assumed through the years, and what voids it leaves in the CG community. Bear in mind these are my own personal opinions.
Update, check the second post in this series: Fabric Engine and a Void in 3DCC Machine Learning.
In the beginning, there was KL
So why was KL, Fabric’s own compiled, high-performance “Kernel Language”, any good for CG folks? While faster is better, one also needs to be able to create stuff. In addition to barebones KL, FE shipped an extensive set of KL libraries that made it possible to load, edit, and display kinematic and geometric 3D data. C++ code could be wrapped in KL, giving access to code libraries like OpenCV and to devices like moCap equipment. FE was integrated with PyQt and had its own custom Python Scene Graph.
So, Fabric’s initial value proposition was: “[i]t’s a framework to build tools” (https://goo.gl/v3FCED). This is corroborated by their initial example projects: surface painting tools, muscle simulation tools, an asset explorer, and so on.
The Engine approach
Apparently, building standalone tools was never really a thing. Fabric Software eventually scrapped the Python DG and the PyQt integration from the software (PyQt came back later). Fabric grew in a different direction, integrating itself into Maya, Softimage, Modo, and later 3ds Max, Rhino, and Unreal. It ran as an engine inside these platforms.
The new approach implied two different value propositions: (1) multi-thread and accelerate parts of the evaluation inside your current environment; (2) guarantee you are evaluating the same thing across different platforms (leading to portability).
At the time (circa 2013), Maya did not have a parallel evaluation graph, nor did any other DCC. So I guess one could use Fabric to heavily optimize a rig; but having to re-implement many of one’s DCC’s basic components for some performance gain is a trade-off I feel only big studios could have made (as DNEG showed at SIGGRAPH 2016: https://goo.gl/SmruhP).
Now, the proposition of portability is a whole different thing; it is desirable to a much wider crowd. If done well, it could future-proof assets and free artists to choose the interface in which they prefer to manipulate them.
The problem is that the high-level components one would expect, for rigging at least, were not there in the beginning. Moreover, a deeper integration with each DCC would have been needed for one to manipulate rigs with the package’s standard tools. These issues were only tackled two years later with Kraken, and even then Kraken was a Maya-only thing (if you disregard Softimage, a defunct DCC).
To top it all off, SideFX launched Houdini Engine just a while later (https://goo.gl/isgS4N), with a similar proposition of asset portability.
Ease of Use vs. Speed
Parallel to its foray into DCC integration, Fabric became a lot more user-friendly with the development of Canvas, a visual programming interface to KL. One could now write KL with nodes, without losing performance.
In reality, some parts of Canvas were not “as multithreaded” as their KL counterparts, but people were always assured that the pieces were in place for them to be. Apparently, ease of use was a development priority, which was great in my humble opinion. Prototyping in a visual environment can be quite productive, even if one ends up rewriting some components in code afterward.
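For those who never touched KL, the performance story rested largely on its parallel-execution (PEX) operators, which Canvas nodes mapped onto. The following is a hedged sketch from memory of the old Fabric documentation (operator names here are made up, and syntax details may be off):

```kl
// Hypothetical sketch of KL's parallel-execution (PEX) syntax.
// An operator declared with <<<index>>> is executed once per index,
// potentially spread across all available cores.
operator scaleEach<<<index>>>(io Scalar values[], Scalar factor) {
  values[index] *= factor;
}

function scaleAll(io Scalar values[], Scalar factor) {
  // Invoking with <<<count>>> dispatches the operator in parallel
  // over indices 0..count-1.
  scaleEach<<<values.size()>>>(values, factor);
}
```

The appeal was that this data-parallel pattern required no explicit thread management from the TD writing it, which is part of why a node in Canvas could stay “C++ fast”.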
Killer Apps and Features
Throughout its lifecycle, Fabric Software tried to excite people with some ‘killer apps’ that could leverage the platform’s strengths. Some old ones come to mind, like a vegetation tool called Flora and a crowd package called Horde. These were never sold to the public as products.
More recently, the company came out with the concept of Asset Patterns, a ‘procedural process for importing or exporting data into or from a host application’ (https://goo.gl/NZhxHX). A fascinating step towards portability, Asset Patterns were the building block for a nice bridge Fabric was building across Rhino, Maya, and Unreal. Sadly, this bridge may have been outshone by Unreal’s own Datasmith (https://goo.gl/EeEHYL).
The end of development and distribution of Fabric Engine may be far more impactful to a handful of companies that relied on the technology than to individual TD folks. However, while things are different today than they were in 2010, the end of the platform still leaves a significant void.
A ‘code once, deploy everywhere’ visual programming tool for writing custom, C++-fast, parallel-ready, GPU-enabled rig components is nowhere in sight. Even basic rig components are not standardized across DCCs from the same vendor; so, without using caches, one cannot expect to port assets more complex than Xfos deforming a geometry.
Another void FE leaves is in Machine Learning (ML). I see ML as a potentially impactful technology for improving animation and FX pipelines. Fabric was a great environment for bringing ML prototyping to a large TD audience, but I’ll go over the specifics of this topic in the following article.
Since no announcements were made regarding what happened to the company and product, many have tried to guess. Bankruptcy? Acquisition? Acqui-hiring? For the fun of it, here is my guess: some client (or group of clients) who was highly invested in the technology purchased it for internal use.