TRACER is a software-agnostic communication infrastructure and toolset for plugging open-source tools into a production pipeline, establishing interoperability between open-source and proprietary tools and targeting real-time collaboration and XR productions. Its operational layer exchanges data objects and updates, including animation and scene data; synchronises scene updates across different client applications (Blender, UE, Unity, VPET, ...); harmonises parameters between different engines and renderers; and provides unified scene distribution as well as scene export, which stores the current state of the scene. Furthermore, TRACER can act as an Animation Host, meeting the high demand in XR and Virtual Production to use, exchange and direct animations in real-time environments. TRACER can be integrated with, and interact with, any game engine (e.g. Unreal) or DCC (e.g. Blender, Maya, ...) through the provided open APIs and protocols. TRACER is fully open source and can be obtained from our GitHub repository, which also includes VPET, AnimHost and DataHub.
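To make the idea of a generalised, harmonised parameter more concrete, here is a minimal C# sketch of an engine-neutral parameter. The class name HarmonisedParameter and its members are our assumptions for illustration, not the actual TRACER API.

```csharp
using System;

// Hypothetical sketch of an engine-neutral parameter, illustrating parameter
// harmonisation: each client maps its native value (e.g. a light intensity in
// engine-specific units) to a shared representation before distribution.
// Names and structure are assumptions, not the actual TRACER API.
public class HarmonisedParameter<T>
{
    public string Name { get; }
    public T Value { get; private set; }

    // Raised whenever the parameter changes, so a networking layer
    // can broadcast the update to all connected clients.
    public event Action<HarmonisedParameter<T>> ValueChanged;

    public HarmonisedParameter(string name, T initialValue)
    {
        Name = name;
        Value = initialValue;
    }

    public void SetValue(T newValue)
    {
        Value = newValue;
        ValueChanged?.Invoke(this);
    }
}
```

In such a scheme, a Blender client would convert its native light power into the shared representation before calling SetValue, and a Unity client would convert back into its own units on receipt.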
VPET – the Virtual Production Editing Tools – are a development from previous research projects. During the pandemic, an effort was made to make their fundamental functionalities available for more general XR applications, resulting in the TRACER FOUNDATION. Through refactoring and reimplementation, the functionalities and implementations of VPET have been transferred into a modern, modular and extensible framework. VPET 2.0 now serves as a sample implementation within the TRACER FOUNDATION framework.
TRACER FOUNDATION features:
‣ An open transfer protocol for providing 3D scenes from any DCC, game engine or XR application
‣ Time-synchronised scene updates for real-time collaboration and multi-user applications
‣ Generalised parameter representation and synchronisation
‣ Network communication through NetMQ/ZeroMQ (see the sketch after this list)
‣ AR support (incl. tracking and markers)
‣ Touch UI (buttons and navigation gestures)
‣ Modular, extensible architecture
And much more...
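As a rough illustration of the NetMQ/ZeroMQ communication listed above, the sketch below broadcasts serialised scene updates over a publish/subscribe socket pair. The endpoint, the "sceneUpdate" topic and the two-frame message layout are illustrative assumptions; TRACER defines its own ports and message format.

```csharp
using NetMQ;
using NetMQ.Sockets;

// Minimal NetMQ publish/subscribe sketch for distributing scene updates.
// Endpoint, topic and payload layout are assumptions for illustration only.
public static class SceneUpdateLink
{
    public static void Publish(byte[] updatePayload)
    {
        using var publisher = new PublisherSocket();
        publisher.Bind("tcp://*:5556"); // assumed endpoint

        // First frame carries the topic, second frame the serialised update.
        publisher.SendMoreFrame("sceneUpdate").SendFrame(updatePayload);
    }

    public static byte[] ReceiveOne()
    {
        using var subscriber = new SubscriberSocket();
        subscriber.Connect("tcp://localhost:5556"); // assumed endpoint
        subscriber.Subscribe("sceneUpdate");

        string topic = subscriber.ReceiveFrameString(); // "sceneUpdate"
        return subscriber.ReceiveFrameBytes();          // serialised update
    }
}
```

In a real client both sockets would be long-lived rather than created per message; they are opened inline here only to keep the sketch self-contained.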
The TRACER FOUNDATION itself is developed in C# and is therefore well suited to target any desktop, VR or XR platform.
Within the MAX-R project, two extensions to TRACER will be realised: AnimHost and DataHub.
DataHub will serve as a software-agnostic communication infrastructure and toolset for plugging open-source tools into the pipelines of Virtual Productions and XR productions in general, establishing interoperability between open-source and proprietary tools and targeting real-time applications.
At the same time, DataHub can store incoming data and act as a scene server, making it possible to store and load complete scenes as well as their changes over time.
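One way such a scene server could work is sketched below: every incoming update is stamped with its arrival time, so a session can be persisted and a scene state reconstructed for any point in time. The class and its API are assumptions for illustration, not DataHub's actual implementation.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of recording scene updates over time, as a scene
// server might: each incoming message is stamped with its arrival time so
// the session can be persisted and replayed. Not DataHub's actual code.
public class SceneRecorder
{
    private readonly List<(TimeSpan Time, byte[] Update)> _timeline = new();
    private readonly DateTime _start = DateTime.UtcNow;

    public void Record(byte[] update) =>
        _timeline.Add((DateTime.UtcNow - _start, update));

    // Replays all updates recorded up to the given session time,
    // reconstructing the scene state at that point.
    public IEnumerable<byte[]> ReplayUntil(TimeSpan sessionTime)
    {
        foreach (var (time, update) in _timeline)
            if (time <= sessionTime)
                yield return update;
    }
}
```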
With technologies like markerless, video-based motion capture and AI-generated character animation, pipelines for animated movies are transforming. Movie productions that use game engines for rendering demand interactive, real-time animation-directing possibilities that can be driven by artists and directors. While AI-based human character animation generators exist in research, their applied usage in animated movie production is sparse. Integrating such solutions into industry-standard DCC applications, game engines etc. is a lengthy process, especially as this is a field where advanced and improved solutions arise frequently. Additionally, the interactive nature, combined with the demand for artist controllability, calls for new user interfaces and pipelines.
AnimHost, as part of the TRACER ecosystem, will address these challenges. It will connect animation generators (such as AI deep neural networks trained on motion-capture databases, video-based low-cost motion capture and many more) to DCC applications, on-set tools like VPET or renderers in general. It will be functionally independent of the animation-receiving application and provide an intuitive interface for supporting new solvers, with a focus on real-time scenarios.
AnimHost will address compatibility issues between generators and consumers by offering a defined API and data structure for streaming human bone animations into arbitrary receivers. Generators will be integrated into AnimHost as plug-ins. A simple node-graph editor will serve as the user interface for setting up the animation pipeline within AnimHost: generators are connected to the streaming node, while any needed transformations or, where necessary, retargeting can be added in between. But AnimHost is not a one-way street. It can not only stream automatically generated or captured animations to multiple devices simultaneously through the DataHub, but will also accept user input such as game-like character control or setting target locations for character movements. Such user input will be possible e.g. with the VPET tablet tool, which can also receive the streamed animations. Additionally, it is planned to provide the generators with an abstract representation of the scene, making automatic obstacle avoidance or interaction possible. Thereby it will be possible to generate animations automatically and to permit interactive directing and blocking by non-professionals and experts alike. The development will be engine-agnostic, allow high-level interaction with a character in real time and be adaptable to any future DCC application.
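To make the notion of a defined data structure for streaming human bone animation concrete, here is a hedged C# sketch of a per-frame skeletal pose message; the type names, fields and layout are assumptions rather than AnimHost's published format.

```csharp
using System.Numerics;

// Hypothetical per-frame skeletal pose for streaming, illustrating the kind
// of defined data structure AnimHost could expose. Field names and layout
// are assumptions, not the published AnimHost format.
public readonly struct BonePose
{
    public readonly int BoneId;          // index into an agreed skeleton definition
    public readonly Vector3 Position;    // local position
    public readonly Quaternion Rotation; // local rotation

    public BonePose(int boneId, Vector3 position, Quaternion rotation)
    {
        BoneId = boneId;
        Position = position;
        Rotation = rotation;
    }
}

public readonly struct AnimationFrame
{
    public readonly int FrameNumber; // frame index in the streamed clip
    public readonly BonePose[] Bones; // one pose per bone, fixed ordering

    public AnimationFrame(int frameNumber, BonePose[] bones)
    {
        FrameNumber = frameNumber;
        Bones = bones;
    }
}
```

Keeping the bone ordering fixed against an agreed skeleton definition is what would allow arbitrary receivers to interpret the stream without per-application negotiation.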
As part of the EU-funded project EMIL, specifically its lighthouse FATE OF THE MINOTAUR, the Filmakademie Baden-Württemberg utilised its TRACER framework for a sample implementation, demonstrating how TRACER can be applied to create a location-based experience.
Based on a Unity implementation and a TRACER snapshot from 14 September 2023, we developed a short tutorial, the “Location Based Experience Example”, showcasing how TRACER can be used in location-based experiences. Since TRACER is engine-agnostic, this work can serve as a template for implementations in other engines or digital content creation (DCC) tools.