
Emote

Emote is a new way of creating superior animated messages for mobile devices. Using text and emotions, an individual animated short message can be created and shared with friends worldwide.

Emote is based on the Filmakademie Application Framework Frapper. The project evaluates Frapper's backend capabilities for cloud computing. By utilizing state-of-the-art graphics acceleration and a dynamic animation logic, the system achieves faster-than-real-time ("Überechtzeit") render performance (TTS engine powered by SVOX).

To get involved in the beta testing of Emote and share it with your friends, go here.


SPLAT!

SPLAT! is a dedicated real-time rendering environment with an emphasis on Non-photorealistic Rendering (NPR) for animators, researchers and others intent on exploring new looks, both still and temporal, as well as the perception thereof. It proposes a re-ordering of 2D/3D processes, a pursuit of computational prowess in service of artistic input, and the potential synergy of connecting artists and graphics researchers. The focus is on community involvement and on the development of an artistically intuitive, technically powerful interface built upon Frapper.

SPLAT! features:

  • Flexible real-time rendering and compositing environment
  • Reuse of existing shading nodes
  • Simple geometry and animation import via Ogre plugins for common DCC applications (see the sketch after this list)
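
To make the Ogre-based import path more concrete, the following minimal sketch loads a mesh exported from a DCC application and assigns it an NPR material. The resource path, mesh name and the "NPR/Hatching" material are hypothetical placeholders, and the actual SPLAT!/Frapper node setup is not shown.

    #include <Ogre.h>

    // Minimal Ogre setup mirroring the import path described above: a mesh
    // exported via the Ogre plugins of a DCC tool is loaded and rendered with
    // an NPR material. All names and paths are placeholders.
    int main()
    {
        Ogre::Root root("plugins.cfg");
        if (!root.restoreConfig() && !root.showConfigDialog())
            return 1;
        Ogre::RenderWindow* window = root.initialise(true, "NPR import sketch");

        // Register the folder containing the exported .mesh and material scripts.
        Ogre::ResourceGroupManager::getSingleton()
            .addResourceLocation("media/exported", "FileSystem");
        Ogre::ResourceGroupManager::getSingleton().initialiseAllResourceGroups();

        Ogre::SceneManager* scene = root.createSceneManager(Ogre::ST_GENERIC);
        Ogre::Camera* camera = scene->createCamera("MainCamera");
        camera->setPosition(Ogre::Vector3(0, 1, 5));
        camera->lookAt(Ogre::Vector3(0, 1, 0));
        window->addViewport(camera);

        // Load the exported geometry and assign the (hypothetical) NPR material.
        Ogre::Entity* character = scene->createEntity("Character", "character.mesh");
        character->setMaterialName("NPR/Hatching");
        scene->getRootSceneNode()->createChildSceneNode()->attachObject(character);

        root.startRendering();
        return 0;
    }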

SPLAT! has its roots in the aQtree cooperation project between the University of Konstanz, Brainpets GbR and Filmakademie. The results of this cooperation have been released as aQtree Frapper Nodes. A variety of animation film projects have been realized using the aQtree algorithms. A future cooperation with the University of Konstanz will pursue the development of new NPR techniques.

In a previous workshop at Filmakademie, Romain Vergne and Pascal Barla from INRIA in France helped to implement their LightWarping technique, yet another example of the versatile use of the Frapper platform.


Examples

StereoBottic

StereoBottic is a game specifically designed for an autostereoscopic display. The game currently supports displays manufactured by Tridelity AG. However, if you don't have an autostereoscopic display, you can still play the game in split-screen mode or by using two monitors.

StereoBottic is a two-player co-op game in which both players steer a robot together over a spherical planet full of lava, traps and platforms. Each player needs only a single button: if the first player presses their button, the robot moves forward, and if the second player presses theirs, the robot jumps.
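
Sketched as code, the control scheme is as simple as it sounds; the Robot type and the update function below are hypothetical and only illustrate the idea, they are not taken from the StereoBottic sources.

    #include <cstdio>

    // Illustrative sketch of the one-button-per-player control scheme described
    // above; the Robot type and the update function are hypothetical, not taken
    // from the actual StereoBottic sources.
    struct Robot {
        float distance = 0.0f;      // distance travelled around the planet
        bool  jumping  = false;

        void moveForward(float dt) { distance += 2.0f * dt; }  // walk speed: 2 units/s
        void jump()                { jumping = true; }
    };

    // Player one's single button makes the robot walk, player two's makes it jump.
    void updateControls(Robot& robot, bool p1Pressed, bool p2Pressed, float dt)
    {
        if (p1Pressed) robot.moveForward(dt);
        if (p2Pressed) robot.jump();
    }

    int main()
    {
        Robot robot;
        updateControls(robot, /*p1Pressed=*/true, /*p2Pressed=*/false, /*dt=*/0.016f);
        std::printf("distance=%.3f jumping=%d\n", robot.distance, robot.jumping);
    }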

The autostereoscopic display enables both players to see stereo and lets each of them perceive a slightly different variation of the world with different content. Therefore, the players also need to talk about what they see in the world and how they should react to it.

The game is also a proof-of-concept for using Frapper as a platform for interactive real-time applications. It is part of the current Frapper development branch.

We are happy to announce that the full game, including all assets, can be downloaded via our Wiki. The original assets, which were created with Autodesk Maya, are released under Creative Commons and can be downloaded here.


StereoBottic showcased at FMX 2012

Frapper VP

Frapper VP picks up the latest trends in Virtual Production by providing innovative tracking and performance capture tools within the Frapper framework.

We have wrapped the PTAMM library from Oxford University to augment a video stream with a pre-animated sequence for on-set visualisation. The versatility of the Frapper framework allows us to rapidly create new production scenarios, as shown in the video, where we combine the markerless matchmove with the performance capture capabilities of the Microsoft Kinect.
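
The per-frame flow can be summarised roughly as follows. The types below are purely illustrative stand-ins for the wrapped tracker and the Frapper rendering side, not the actual PTAMM or Frapper interfaces.

    #include <vector>

    // Purely illustrative placeholders -- not the real PTAMM or Frapper interfaces.
    struct Image      { int width = 0, height = 0; std::vector<unsigned char> rgba; };
    struct CameraPose { float rotation[9] = {1,0,0, 0,1,0, 0,0,1}; float translation[3] = {0,0,0}; };

    // Stand-in for the wrapped PTAMM tracker: estimates the camera pose per frame.
    struct MarkerlessTracker {
        CameraPose track(const Image& /*frame*/) { return CameraPose(); }
    };

    // Stand-in for the Frapper node that renders the pre-animated sequence.
    struct OverlayRenderer {
        Image render(const CameraPose& /*pose*/, double /*animationTime*/) { return Image(); }
    };

    // Alpha-blend the rendered overlay onto the live video frame (stubbed here).
    Image composite(const Image& video, const Image& /*overlay*/) { return video; }

    // One iteration of the on-set visualisation loop: track, render, composite.
    Image augmentFrame(MarkerlessTracker& tracker, OverlayRenderer& renderer,
                       const Image& videoFrame, double animationTime)
    {
        CameraPose pose = tracker.track(videoFrame);       // markerless matchmove
        Image cg = renderer.render(pose, animationTime);   // pre-animated CG element
        return composite(videoFrame, cg);                  // augmented frame for on-set review
    }

    int main()
    {
        MarkerlessTracker tracker;
        OverlayRenderer renderer;
        Image frame;                                        // would come from the camera feed
        Image augmented = augmentFrame(tracker, renderer, frame, /*animationTime=*/0.0);
        (void)augmented;                                    // would be shown on the on-set monitor
    }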


EmoCow

EmoCow, the emotion control wheel, is an alternative input device for facial animation.

The video demonstrates the rapid prototyping capability of the Frapper framework. EmoCoW is a compact tool written in Qt/C++ and was implemented as a simple Frapper node to make use of the sophisticated built-in facial animation abilities of the application framework.
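
The core idea of an emotion wheel is easy to sketch: the angle of the cursor on the wheel selects the emotion, and the distance from the centre its intensity. The emotion set, ordering and mapping below are assumptions for illustration, not the actual EmoCoW code.

    #include <cmath>
    #include <cstdio>

    // Hypothetical emotion layout around the wheel -- illustration only,
    // not the ordering used by the actual EmoCoW tool.
    static const char* kEmotions[]  = { "joy", "surprise", "fear", "anger", "disgust", "sadness" };
    static const int   kNumEmotions = 6;

    struct WheelSample {
        const char* emotion;    // which facial expression to blend towards
        float       intensity;  // 0 = neutral (centre), 1 = full expression (rim)
    };

    // Map a cursor position on the wheel (relative to its centre) to an
    // emotion/intensity pair: the angle picks the sector, the radius the strength.
    WheelSample sampleWheel(float x, float y, float wheelRadius)
    {
        float angle = std::atan2(y, x);                    // -pi..pi
        if (angle < 0.0f) angle += 2.0f * 3.14159265f;     // 0..2*pi
        int sector = static_cast<int>(angle / (2.0f * 3.14159265f) * kNumEmotions) % kNumEmotions;

        float radius    = std::sqrt(x * x + y * y);
        float intensity = std::fmin(radius / wheelRadius, 1.0f);

        return { kEmotions[sector], intensity };
    }

    int main()
    {
        WheelSample s = sampleWheel(0.3f, 0.4f, 1.0f);     // cursor in the upper-right area
        std::printf("%s @ %.2f\n", s.emotion, s.intensity);
    }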

To learn more about this project, check the paper "EmoCoW - An Interface for Real-Time Facial Animation" in our Publications section.

The Emotional Color Wheel

Real-Time Eye Shading

This video demonstrates a complex real-time eye shader developed for the Frapper Agent Framework.

The lighting considers physical effects such as multiple refractions and subsurface scattering; for example, the color of the iris is calculated based on its pigmentation. Furthermore, a shader-based tear simulation is presented.
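
As a purely illustrative sketch of the pigmentation idea (not the project's shader code), the iris base colour could be approximated by blending between a low-melanin blue/grey and a high-melanin brown according to a single pigmentation parameter:

    #include <cstdio>

    struct Color { float r, g, b; };

    // Linear blend helper.
    static Color mix(Color a, Color b, float t)
    {
        return { a.r + (b.r - a.r) * t,
                 a.g + (b.g - a.g) * t,
                 a.b + (b.b - a.b) * t };
    }

    // Purely illustrative: approximate the iris base colour from a single
    // pigmentation (melanin) parameter in [0,1]. Low pigmentation scatters
    // towards blue/grey, high pigmentation absorbs towards brown.
    Color irisBaseColor(float pigmentation)
    {
        const Color lowMelanin  = { 0.35f, 0.55f, 0.75f };  // blue/grey iris
        const Color highMelanin = { 0.30f, 0.18f, 0.08f };  // dark brown iris
        return mix(lowMelanin, highMelanin, pigmentation);
    }

    int main()
    {
        Color c = irisBaseColor(0.2f);   // lightly pigmented iris
        std::printf("rgb = %.2f %.2f %.2f\n", c.r, c.g, c.b);
    }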


Real-time Character Dynamic Animation

The video clip demonstrates the real-time animation possibilities of the Frapper framework. The stairs are generated randomly at runtime, and a state machine controls the character so that the animation automatically adapts to these changes in the environment.
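
A minimal sketch of such a state machine, with hypothetical states and thresholds rather than the actual Frapper setup, could look like this:

    #include <cstdio>

    // Hypothetical character states for the stair-climbing demo; the actual
    // Frapper state machine and its animation clips may differ.
    enum class State { Walk, ClimbStep, Idle };

    struct Environment {
        bool  stepAhead;    // is there a (randomly generated) stair directly in front?
        float stepHeight;   // height of that stair
    };

    // Pick the next state purely from what the character finds in front of it,
    // so the animation adapts automatically when new stairs are generated.
    State nextState(State current, const Environment& env)
    {
        if (env.stepAhead && env.stepHeight > 0.05f)
            return State::ClimbStep;     // blend into a climb cycle scaled to stepHeight
        if (current == State::ClimbStep && !env.stepAhead)
            return State::Walk;          // step finished, back to the walk cycle
        return current == State::Idle ? State::Idle : State::Walk;
    }

    int main()
    {
        State s = State::Walk;
        Environment env{ true, 0.25f };  // a stair appears at runtime
        s = nextState(s, env);
        std::printf("state = %d\n", static_cast<int>(s));
    }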