SAUCE

SAUCE (Smart Assets for re-Use in Creative Environments) is a three-year EU Research and Innovation project bringing together eight companies and research institutions to create a step-change in how creative industry companies re-use existing digital assets for future productions.



SAUCE will research, develop, pilot and demonstrate a set of professional tools and techniques for making content ‘smarter’, so that it is fully adaptive in a broad, unprecedented manner: adaptive to context (which facilitates re-use), to purpose (among or within industries), to the user (improving the viewing experience), and to the production environment (so that it is ‘future-proof’). The approach is based on research into

 

  • light-field technology
  • automated classification and tagging using deep learning and semantic labeling to describe and draw inferences
  • the development of tools for automated asset transformation, smart animation, storage and retrieval.

 

These new technologies and tools will show that vast cost reductions and efficiency gains are possible, facilitating the production of more content of higher quality and creativity, to the benefit of the competitiveness of the European creative industries.


More information can be found on the official project webpage.

This project has received funding from the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement No 780470.

Lightfields

Throughout the SAUCE project, multiple shoots with the lightfield camera rig are being realized. A first experimental production was shot with the Saarland University lightfield rig at the Filmakademie Baden-Württemberg studios in October 2018. The exemplary lightfield production 'Unfolding' was shot with the same rig at a studio of Saarländischer Rundfunk in January 2019. Postproduction is starting in early 2019, and lightfield data will become available during 2019.


The SAUCE Lightfield Elements Shoot example data, captured in October 2018, contains, among other things, walk cycles and fire and smoke elements.

Download the Lightfield Elements Shoot data


Semantic Animations

Semantic Animation describes the use of high-level descriptors (such as verbal commands) to direct the performances of special character assets that are capable of determining their own detailed low-level behaviours and scene interactions. The required semantic information can be created in a manual authoring process or by machine learning. This topic includes our work in both virtual production and physically based animation.
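
The example below is a minimal, self-contained sketch of this idea in Python: a verbal command is reduced to a small semantic directive (action, emotion, target), which the character asset then resolves to a concrete motion clip. The command grammar, the emotion vocabulary and the clip names are hypothetical and only illustrate the principle, not the actual SAUCE tooling.

    from dataclasses import dataclass

    # Hypothetical vocabulary; neither the adverbs nor the clip names come from SAUCE.
    ADVERB_TO_EMOTION = {"happily": "happy", "sadly": "sad", "angrily": "angry"}
    MOTION_CLIPS = {
        ("walk", "neutral"): "walk_neutral_01",
        ("walk", "happy"): "walk_happy_01",
        ("walk", "sad"): "walk_sad_01",
        ("run", "neutral"): "run_neutral_01",
    }

    @dataclass
    class Directive:
        action: str   # high-level verb, e.g. "walk"
        emotion: str  # performance modifier, e.g. "happy"
        target: str   # scene object the character should move towards or interact with

    def parse_command(command: str) -> Directive:
        """Very naive parsing: 'walk happily to the door' -> Directive('walk', 'happy', 'door')."""
        words = command.lower().rstrip(".").split()
        emotion = next((ADVERB_TO_EMOTION[w] for w in words if w in ADVERB_TO_EMOTION), "neutral")
        return Directive(action=words[0], emotion=emotion, target=words[-1])

    def select_clip(directive: Directive) -> str:
        """The smart asset resolves the high-level directive to a low-level motion clip."""
        return MOTION_CLIPS.get((directive.action, directive.emotion),
                                MOTION_CLIPS.get((directive.action, "neutral"), "idle_01"))

    # Example: the director's command becomes a clip choice plus a navigation target.
    directive = parse_command("walk happily to the door")
    print(directive, "->", select_clip(directive))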

 

Objectives

  • To develop and demonstrate real-time control systems for authoring animated content using smart assets, automatically synthesising new scenes from existing ones, and integrating smart assets into virtual production scenarios with editable cameras and lights.
  • To test the prototype technologies and tool-kits in a series of experimental productions and evaluate their performance in realistic contexts of professional use.

 

In this context, Filmakademie has created and released the PHS Motion Library. It contains over an hour of bipedal reference motion capture data and videos. The dataset was created with a particular focus on emotional variations of walk cycles.


Download the PHS Motion Library


Semantic Character Animation

Semantic character animation is realized with a dedicated focus on virtual production scenarios. Filmakademie developed an open character streaming protocol for its VPET toolset: the entire character (including weights, skeleton, etc.) can be transferred to the tablets at run time. The newly developed API then allows arbitrary external animation solving engines to animate the character through streamed bone animations.

 

This interface extends the open architecture of VPET. Although the client is based on Unity, any host (providing the scene to the VPET clients) can be connected; we have demonstrated this with Foundry Katana in the past. The latest addition makes it possible to connect an arbitrary animation engine.
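
As an illustration of how such a bone-streaming interface can look, the sketch below publishes solved poses over ZeroMQ with a JSON payload. The topic name, message layout and port are hypothetical and are not the actual VPET protocol or API; the snippet only shows the general pattern of an external animation engine pushing bone transforms to a host.

    import json
    import time

    import zmq  # pyzmq; any comparable message transport would serve the same purpose

    # Hypothetical publisher: an external animation solving engine pushing poses
    # to a host that applies them to the previously streamed character rig.
    context = zmq.Context()
    socket = context.socket(zmq.PUB)
    socket.bind("tcp://*:5556")  # port chosen arbitrarily for this sketch

    def stream_pose(character_id: str, bone_rotations: dict) -> None:
        """Send one solved animation frame as a JSON message (illustrative format only)."""
        message = {
            "character": character_id,
            "timestamp": time.time(),
            # bone name -> quaternion (x, y, z, w), e.g. {"hips": [0, 0, 0, 1], ...}
            "bones": bone_rotations,
        }
        socket.send_multipart([b"pose", json.dumps(message).encode("utf-8")])

    # Stream a neutral pose for a single bone at roughly 25 fps.
    if __name__ == "__main__":
        while True:
            stream_pose("character_01", {"hips": [0.0, 0.0, 0.0, 1.0]})
            time.sleep(0.04)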

 

Semantic Scene Understanding extracts labels for the (smart) assets of a scene by means of machine learning. These labels can be used to prepare the assets for their dedicated production scenario or to dynamically create character animations.
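
As a sketch of what such label extraction can look like, the snippet below tags an asset's rendered thumbnail with an off-the-shelf pretrained image classifier; torchvision's ResNet-50 and its ImageNet labels merely stand in for a production-specific model and vocabulary, and the thumbnail path is hypothetical.

    import torch
    from PIL import Image
    from torchvision import models

    # Off-the-shelf classifier as a stand-in for a production-specific labeling model.
    weights = models.ResNet50_Weights.DEFAULT
    model = models.resnet50(weights=weights).eval()
    preprocess = weights.transforms()

    def tag_asset(thumbnail_path: str, top_k: int = 3) -> list:
        """Return the top-k (label, confidence) pairs for a rendered asset thumbnail."""
        image = preprocess(Image.open(thumbnail_path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            probabilities = torch.softmax(model(image)[0], dim=0)
        scores, indices = probabilities.topk(top_k)
        return [(weights.meta["categories"][i], float(s)) for i, s in zip(indices, scores)]

    # Example: attach the labels to the asset's metadata for later search and retrieval.
    print(tag_asset("assets/miniature_house_thumbnail.png"))  # hypothetical path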

Filmakademie released the Love & 50 Megatons - Miniature City Assets. This 2020 VES-nominated production made heavy use of a virtual production scenario and scanned miniature assets.

Download Love & 50 Megatons Assets

Publications, Presentations & Press

Recent publications, presentations and press coverage.

 

  • SAUCE Project: Light Fields for Movie Productions
    New (Visual) Media Formats at NEM Summit 2019. May 22nd & 23rd 2019, Zagreb, Croatia.
    Simon Spielmann

  • Light Fields for Movie Productions
    Tools of Tomorrow: Light Fields Track, FMX2019. May 03 2019, 15:15 - 16:15, Bertha-Benz-Saal, Stuttgart, Germany
    Thorsten Herfet (Saarland University), Simon Spielmann (Filmakademie), Jonas Trottnow (Filmakademie)

  • European Union project SAUCE – Project outline presentation and Q&A session at FMX2019
    April 30 2019, Steinbeiss-Saal, 15:30-17:30. Stuttgart, Germany
    May 02 2019, Steinbeiss-Saal, 11:30-13:30. Stuttgart, Germany

  • Nano 3-Sat (TV coverage)
    Tuesday April 30th 2019. 18:30

  • Tagesschau (TV coverage)
    Tuesday April 30th 2019. 12:00

  • SWR aktuell (TV coverage and online)
    Tuesday April 30th 2019. 19:30

  • SRF: Diese Technologie wird die Filmproduktion revolutionieren ("This technology will revolutionize film production"; radio & online coverage)
    Wednesday January 16th 2019

  • SR Aktueller Bericht (TV coverage at 17:40)
    Monday January 7th 2019. 19:20