Using face tracking AI to drive Unreal Engine cameras

So much great tech is now available for artists and technologists to assemble in novel ways. We all have the means to invent the grammar of tomorrow’s language. In this case, can we make the interaction with virtual spaces more natural and intuitive? The clip below shows a prototype that uses an AI face tracker to move a virtual camera in Unreal Engine. The Unreal camera moves left and right when I move my head left and right, it rotates when I turn my head from side to side, and it moves forward and backward when I move my head toward or away from the sensor.

This setup uses the OAK-D AI-assisted computer vision camera. The OAK-D supports Google’s MediaPipe library of edge AI models and has a robust Python API, which is used to capture, calibrate, and stream the tracking data to Unreal Engine in real time. The data travels over the OSC protocol and is exposed to Blueprints via a plugin, where it drives the camera transforms and a UI overlay that provides the necessary visual feedback. While not necessarily designed for permanent installations, this setup makes a very efficient toolset for rapid prototyping of ideas, particularly since MediaPipe includes many other models, such as full-body pose estimation and hand gesture recognition, among others.
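To make the mapping concrete, here is a minimal sketch of how tracked head pose can be converted into per-frame camera deltas. This is not the production code: the function name, gain values, and dead-zone thresholds are all illustrative assumptions, and the inputs assume the tracker provides a normalized horizontal face center, a stereo depth reading, and an estimated head yaw.

```python
# Illustrative head-pose -> camera mapping. All names and gains are
# assumptions, not the actual prototype's values.

DEAD_ZONE_POS = 0.05   # normalized offset below which jitter is ignored
DEAD_ZONE_YAW = 2.0    # degrees of head yaw treated as "looking straight"
TRUCK_GAIN = 200.0     # cm of camera truck per unit of normalized offset
DOLLY_GAIN = 100.0     # cm of camera dolly per metre of depth change
YAW_GAIN = 1.0         # camera degrees per degree of head yaw
REF_DEPTH_M = 1.0      # assumed resting distance from the sensor

def dead_zone(value, threshold):
    """Zero out small values so sensor noise doesn't drift the camera."""
    return 0.0 if abs(value) < threshold else value

def face_to_camera_delta(cx, depth_m, yaw_deg):
    """Map a tracked face (normalized horizontal center cx in 0..1,
    depth in metres, head yaw in degrees) to per-frame camera deltas
    (truck_cm, dolly_cm, yaw_deg)."""
    truck = dead_zone(cx - 0.5, DEAD_ZONE_POS) * TRUCK_GAIN
    dolly = dead_zone(REF_DEPTH_M - depth_m, DEAD_ZONE_POS) * DOLLY_GAIN
    yaw = dead_zone(yaw_deg, DEAD_ZONE_YAW) * YAW_GAIN
    return truck, dolly, yaw
```

In a setup like the one described above, values of this kind would be sent to Unreal as OSC messages (for example with a UDP OSC client) and applied to the camera transform in Blueprints each tick.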

Bringing VFX environments to an LED volume stage

Unreal displaying the environment on the LED wall of a volume stage

Relentless Play put together a presentation for Turning Point Productions to provide a big picture overview of virtual and digital workflows. Combining a 3D build of the original physical sets with the digital environment backgrounds developed for the VFX shots, we created an Unreal scene to illustrate the potential of LED walls. A big thanks to Pepe Valencia from Baraboom! for explaining how previz and techviz are used to inform the process, and to VFX Technologies for availing their stage and support to the project. It was great to see everyone come together to make it a success.

The growing adoption of USD is making this process more and more transparent, and any parts that are not yet fully implemented can be addressed with Python code. In this case, Relentless Play originally built the city environment in Maya/Arnold for traditional offline rendering. For this presentation, we developed a process to convert the individual components, rebuild the Maya hierarchy with a referencing pipeline, and transfer the material assignments, all into a single USD stage that loaded efficiently into Unreal with no manual effort.
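One small but essential step in a conversion like this is translating Maya DAG paths into legal USD prim paths, since USD identifiers only allow letters, digits, and underscores and may not start with a digit. The actual pipeline used the Maya and USD APIs; the sketch below isolates just that path-mapping step, with a hypothetical function name.

```python
import re

def maya_to_usd_path(dag_path):
    """Convert a Maya DAG path like '|city|bldg-01|meshShape' into a
    legal USD prim path such as '/city/bldg_01/meshShape'.

    Illegal characters are replaced with underscores, and names that
    start with a digit are prefixed with one, since USD identifiers
    may not begin with a digit."""
    parts = [p for p in dag_path.split("|") if p]
    clean = []
    for name in parts:
        name = re.sub(r"[^A-Za-z0-9_]", "_", name)
        if name[0].isdigit():
            name = "_" + name
        clean.append(name)
    return "/" + "/".join(clean)
```

With a mapping like this in place, each converted component can be referenced under its sanitized prim path and the Maya material assignments re-bound on the USD stage.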

Relentless Play VFX Delivers The Little Render of Bethlehem

Relentless Play delivers VFX shots featuring ancient Bethlehem for Turning Point Productions’ “Why The Nativity?”

In the delivery of over 30 VFX shots, Thomas Hollier and Relentless Play LLC created digital environments for the ancient cities of Bethlehem and Jerusalem, and handled all aspects of VFX on the project, from visual development and previsualization to on-set supervision and final shot production. From the start, the digital work proceeded in close collaboration with production designer Joe Cashman and the team at Design/Build Productions. The early integration of set CAD data and digital assets into a consistent digital environment, staged onto the set location’s actual elevation data, not only provided accurate shot previz but also served as a “process-agnostic” planning tool for tailoring the physical production methods to well-defined creative goals.

Driftwood

Many of the generative strategies used in traditional VFX are now easily applied to electronic components. This integration provides a much larger toolset in which the boundaries between physical and virtual become blurred, and the canvas becomes immersive rather than flat.

Beach wood, stepper motors, arduino, grout.

Custom Generative Art Display

What is the right way to display generative art? After having developed a fairly substantial collection of algorithmic art pieces, running them in a browser or on a cluttered computer desktop left me wanting. In 2014, I set out to build this display device to showcase the work in a more direct way, without the distracting UI clutter and gear that accompanies the typical interaction with technology.

Electronic Time Piece with Cement and Light

Building a binary clock is a rite of passage for artists interested in exploring electronics. There is a subversive edge in over-designing it to the extent that its alleged primary purpose becomes obfuscated until all that is left is the contemplation of mysterious patterns. My version of this endeavor involves contrasting the immateriality of light with the weight and cold smoothness of polished cement. But it DOES tell time, although by the time you’ve decoded the bits, the time will be different.

The clock's circuit board was designed around an Atmel microcontroller, with shift registers to address the individual LEDs, an RTC chip to keep accurate time, and a voltage regulator.
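The core of such a design is packing the time into a bit pattern that can be clocked serially into the shift registers. The actual firmware would run in C on the Atmel chip; the sketch below shows only the bit-packing step in Python, under an assumed layout of 5 hour bits and 6 minute bits, MSB-first, padded to 16 bits for two daisy-chained 8-bit registers.

```python
def time_to_led_bits(hours, minutes):
    """Pack the time as 5 hour bits + 6 minute bits (11 bits total),
    padded to 16 bits for two daisy-chained 8-bit shift registers.
    Returns the bits MSB-first, in the order they would be clocked out.
    The layout is an assumption for illustration, not the clock's
    actual wiring."""
    if not (0 <= hours < 24 and 0 <= minutes < 60):
        raise ValueError("invalid time")
    word = (hours << 6) | minutes
    return [(word >> i) & 1 for i in range(15, -1, -1)]
```

Decoding works in reverse: shift the hour bits back down and mask off the minute bits, which is exactly the mental arithmetic the viewer performs while the time quietly moves on.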