About The Project

RAVEN (Real-time Adaptive Virtual-Twin Environment for Next-Generation Robotics in Virtual Production) makes Unreal Engine 5 (UE5) the environment in which robotic crew members are trained, validated, and deployed. The system embeds a live digital twin of the stage in UE5 and connects it to the Robot Operating System 2 (ROS 2), so a humanoid robot can perform camera operation (and later lighting/FX) with frame-accurate timing, low latency, and predictive safety.

Why now: LED/XR stages deliver real-time pixels, but the physical layer (camera placement, lighting pose, practical FX) still relies on manual rigging and rehearsals that slow iteration and add safety overhead. RAVEN closes this gap by giving UE5 direct influence over robot motion and stage devices: directors and DPs can author, preview, and replay robotic camera moves entirely inside UE5, with visible safety zones and timing aligned to LED scanout and camera shutter (pose-at-display-time).
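The pose-at-display-time idea above can be sketched as latency-compensating extrapolation: rather than using the pose as it was when tracking sampled it, the system predicts where the camera will be at the instant the frame reaches the LED wall. The sketch below is a minimal illustration under assumed names and a simple linear-velocity model; it is not RAVEN's actual predictor, which would account for the full tracking-render-scanout pipeline.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Camera position in stage coordinates (meters). Illustrative only;
    a real system would also carry orientation and a timestamp."""
    x: float
    y: float
    z: float


def predict_pose_at_display_time(pose: Pose,
                                 velocity: tuple[float, float, float],
                                 capture_time_s: float,
                                 display_time_s: float) -> Pose:
    """Linearly extrapolate the last tracked pose to the moment the frame
    is displayed, compensating for end-to-end pipeline latency."""
    dt = display_time_s - capture_time_s
    return Pose(pose.x + velocity[0] * dt,
                pose.y + velocity[1] * dt,
                pose.z + velocity[2] * dt)


# Example: pose sampled at t = 0.000 s, frame shown on the wall at t = 0.050 s,
# camera dollying at 0.2 m/s along x.
latest = Pose(1.0, 2.0, 0.5)
shown = predict_pose_at_display_time(latest, (0.2, 0.0, 0.0), 0.000, 0.050)
```

A constant-velocity model is the simplest choice; production predictors typically filter noisy tracking data (e.g. with a Kalman filter) before extrapolating.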

Key Features

Outcomes

Our Team

Contact

For any enquiries, please reach out to us at raven.modie@gmail.com.