In search of a systematic way to test the combined Dreamspace Demonstrator, the partners came up with the idea of creating a futuristic environment for an interactive TV show set in the science-fiction genre. The advancements of the Dreamspace technology are best showcased in a focused environment, built under the premise of a real-world-scale production while being set up exclusively for evaluation purposes.
Battleground is the ultimate TV game show competition and viewing experience. In each one-hour episode, contestants compete to see which team has the best bot, agility, weapons, defenses and team strategy to defeat the other competitors. Battling bot teams compete on a green-screen motion capture stage, wearing mocap suits while watching a large projection screen. Utilizing real-time motion capture, virtual environments and game engine interaction, each team must coordinate their actions (and virtual bots) to maneuver through dangerous environments while defeating the opposing bots.
To demonstrate the production value and to test the workflow internally, the resulting imagery of the Battleground production includes behind-the-scenes footage as well as highly polished shots from our VFX pipeline. For this delivery, we tested most of the Dreamspace technologies in combination. Battleground is the final production to apply the final Demonstrator.
In terms of shot complexity and shot count, On Your Way was quite an ambitious project. To reduce the number of shots tying up resources in postproduction, the student team decided to use rear projection whenever possible, using the Ncam tracking data to compensate for the flat projection surface in a perspectively correct way. The number of VFX shots could thus be reduced tremendously. Half of the shots were still shot against blue screen, offering the possibility to exercise Dreamspace developments such as Live View, Ncam depth and VPET.
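Rendering the background perspectively correct for a flat rear-projection surface is commonly done with an off-axis (generalized) perspective projection computed from the tracked camera position and the screen corners. The following sketch illustrates that idea; the function name and parameters are our own, not part of the Ncam toolchain or the actual production setup.

```python
import numpy as np

def off_axis_projection(eye, screen_ll, screen_lr, screen_ul,
                        near=0.1, far=100.0):
    """Off-axis perspective projection for a flat projection surface
    viewed from a tracked camera position `eye`.

    screen_ll, screen_lr, screen_ul: lower-left, lower-right and
    upper-left corners of the screen in world coordinates.
    """
    # Orthonormal basis of the screen plane
    vr = screen_lr - screen_ll
    vu = screen_ul - screen_ll
    vr = vr / np.linalg.norm(vr)
    vu = vu / np.linalg.norm(vu)
    vn = np.cross(vr, vu)
    vn = vn / np.linalg.norm(vn)

    # Vectors from the eye to the screen corners
    va = screen_ll - eye
    vb = screen_lr - eye
    vc = screen_ul - eye

    # Distance from the eye to the screen plane
    d = -np.dot(va, vn)

    # Frustum extents projected onto the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix
    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])
```

Feeding the live camera position from the tracking data into `eye` each frame skews the frustum so that, from the camera's point of view, the projected image lines up as if the screen were a window into the virtual set.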
Up to the point of shooting, the production could be considered the biggest use case for Dreamspace-related technology apart from the integration tests, as it applied not only single components in isolation but made use of the entire pipeline suggested by the Dreamspace consortium. Ncam gathered camera tracking data, which was streamed both to Live View and to several VPET clients. The decision-makers were able to manipulate virtual assets by transforming them via VPET, while the results underwent real-time compositing and rendering to be previewed on monitors on set. Additionally, depth data was recorded.
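The one-source/many-sinks shape of this pipeline, a single tracking source feeding Live View and several VPET clients at once, can be sketched as a simple datagram fan-out. The JSON-over-UDP format below is purely illustrative; the actual Dreamspace and VPET wire protocols are not described here.

```python
import json
import socket

# Illustrative only: the real Dreamspace/VPET wire format and ports are not
# documented in this text, so this sketch invents a minimal JSON-over-UDP one.

def serialize_pose(frame, position, rotation):
    """Pack one tracked-camera sample (frame number, position, quaternion)."""
    return json.dumps({"frame": frame,
                       "pos": list(position),
                       "rot": list(rotation)}).encode("utf-8")

def broadcast_pose(sock, clients, frame, position, rotation):
    """Send the same sample to every subscribed client, e.g. the Live View
    renderer and several VPET tablets, mirroring the on-set fan-out."""
    packet = serialize_pose(frame, position, rotation)
    for addr in clients:
        sock.sendto(packet, addr)

def receive_pose(sock):
    """One client-side read: each datagram carries one camera pose."""
    data, _ = sock.recvfrom(4096)
    return json.loads(data.decode("utf-8"))
```

Because every sink receives the identical sample, the compositing preview and the tablet clients stay in sync with the physical camera without any client-to-client coordination.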
Project ‘Skywriters’ is a Filmakademie documentary film about a family business for sky advertising. In addition to the live-action footage, the team approached the Institute of Animation at Filmakademie for additional computer-generated shots. This collaboration allowed evaluating the then-current status of the Dreamspace Live View system integration in combination with Ncam real-time camera tracking and the VPET tools. The Ncam system was mounted on a virtual camera, displaying its views on a 9’’ screen and simultaneously on a large projector. The director had no previous experience with virtual production technology. After a short introduction, he was able to direct digital assets and animation using a VPET tablet, designing the shots in agreement with the director of photography. Time-consuming scene preparation and manual animation data alignment were the remaining limitations at this point of system integration in December 2015. The setup was evaluated as very intuitive and as having high potential to increase creativity.
Once more the company Camdroid tries to revolutionize the filmmaking business with a ground-breaking product: the autonomous camera robot CamBot Mark 2. However, this leaked video reveals some dramatic deficits.
The short film 'Camdroid' was produced in a virtual production environment at Filmakademie, combining motion capture with live camera tracking via Ncam and an early prototype of the set editing tools.
An interface prototype was built that enables both the director and the cinematographer to interactively improve the scenery and receive high-quality feedback in real time. A head-mounted display allows mobile exploration and therefore a more natural perception of virtual elements. A gesture recognition sensor makes it possible to instantaneously relocate and manipulate objects and lights with simple gestures.
Watch the final film here.
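The gesture-based relocation described above, grab an object, move the hand, release, can be sketched as a small state machine that latches the hand and object positions when a pinch starts and applies the hand's displacement while it is held. Class and parameter names are hypothetical; the actual sensor API and the prototype's logic are not specified in this text.

```python
# Minimal sketch of pinch-to-drag object relocation, as one might build on
# top of a hand-tracking gesture sensor. All names here are illustrative.

class GrabManipulator:
    def __init__(self):
        self.grabbed = False
        self.hand_start = None
        self.object_start = None

    def update(self, hand_pos, pinch_strength, object_pos, threshold=0.8):
        """Return the new object position for one sensor frame.

        While pinching, the object follows the hand's displacement since
        the pinch began; on release, the object stays where it was left.
        """
        if pinch_strength >= threshold:
            if not self.grabbed:  # pinch started: latch both positions
                self.grabbed = True
                self.hand_start = hand_pos
                self.object_start = object_pos
            delta = [h - s for h, s in zip(hand_pos, self.hand_start)]
            return [o + d for o, d in zip(self.object_start, delta)]
        self.grabbed = False      # pinch released: object stays put
        return object_pos
```

Latching the offset at grab time (rather than snapping the object to the hand) keeps the manipulation relative, so a small hand motion produces an equally small object motion regardless of where the object sits in the scene.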
Students creating the animated short film Song of a Toad had the unique opportunity to embrace the latest developments of the Filmakademie research department, made within the scope of the EU-funded project Dreamspace, in their creative filmmaking process. An experimental setup combined a Nuke real-time compositing prototype developed by Foundry with a live puppeteering device steering one of the protagonists. The puppeteering device was invented by the students Kariem Saleh, Florian Greth and Amit Rojtblat outside the scope of project Dreamspace. This allowed the students to discuss and iterate various animation layouts within an established compositing workflow.