Creative Tech

Emotion Engine

Self-assigned
Case Study

Emotion Engine

Work
Concept, Realtime VFX, Unreal (Niagara, Blueprint, Shader, Sequencer, UMG), Python, OpenAI, ElevenLabs
Client
Self-assigned
AI-powered realtime emotion tracker with a voice interface

Cinematic Trailer

Summary

What if an AI had direct access to our emotions? What would it make of them? How can this (often ambiguous) data be visualized? Over the past months I have spent my spare time working on a speculative experience to answer these questions. The result is EMO//ENGINE. 💀

Each base emotion (Joy, Fear, Sadness, Anger, Disgust, Surprise) is visualized with custom shaders that morph and behave differently depending on the emotion input. Emotions are captured in "realtime" using a tracking system based on a deep learning method. The raw emotion data is interpreted by GPT-3.5 via the OpenAI API, and the response is turned into a human voice using the ElevenLabs API. The experience was built mostly in Epic Games' Unreal Engine 5.
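As a rough illustration of the tracking step, the sketch below pulls per-frame emotion scores from a webcam in Python. DeepFace is used here purely as a stand-in for a deep-learning emotion model, so the library, the call, and the score format are assumptions rather than the project's actual setup.

import cv2
from deepface import DeepFace

cap = cv2.VideoCapture(0)  # default webcam
ret, frame = cap.read()
if ret:
    # Recent DeepFace versions return a list of dicts, one per detected face
    result = DeepFace.analyze(frame, actions=["emotion"], enforce_detection=False)
    scores = result[0]["emotion"]             # e.g. {"happy": 87.2, "fear": 0.4, ...}
    dominant = result[0]["dominant_emotion"]  # strongest of the base emotions
    print(dominant, scores)
cap.release()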
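The interpretation and voice steps could look something like this, assuming the current OpenAI Python client and the ElevenLabs text-to-speech REST endpoint. The prompt, model name, voice ID, and API key handling are placeholders, not the values used in the project.

import requests
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def interpret(scores: dict) -> str:
    # Let GPT-3.5 comment on the raw emotion scores
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "You are an AI watching a human's emotional state. React in one short sentence."},
            {"role": "user", "content": f"Current emotion scores: {scores}"},
        ],
    )
    return reply.choices[0].message.content

def speak(text: str, voice_id: str = "YOUR_VOICE_ID") -> bytes:
    # ElevenLabs text-to-speech: returns raw audio bytes for playback
    resp = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{voice_id}",
        headers={"xi-api-key": "YOUR_ELEVENLABS_KEY"},
        json={"text": text},
    )
    resp.raise_for_status()
    return resp.content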
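To drive the Niagara systems and shader parameters, the scores also have to reach UE5 each frame. One minimal way to bridge the Python side and the engine is a small UDP packet, as sketched below; host, port, and payload format are assumptions, and the Blueprint that receives and parses the data inside Unreal is not shown.

import json
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_to_unreal(scores: dict, host: str = "127.0.0.1", port: int = 7777) -> None:
    # One JSON datagram per frame; the Unreal side maps each emotion score
    # to material and Niagara parameters
    sock.sendto(json.dumps(scores).encode("utf-8"), (host, port))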

