We've seen how powerful virtual production can be on big movie sets. We think it can be just as powerful in smaller environments, especially when paired with motion control. We pulled all the pieces together to film a minimum-viable demo with live green screen keying via the Blackmagic Ultimatte 12 and a virtual world powered by Unreal Engine.
What makes virtual production different from a typical green screen shoot?
Virtual production means you're able to see the final result, or 'composite', live on set. Additionally, when the camera moves, you see proper parallax in the background. This is crucial to selling the effect, and it's typically the biggest pain point for Unreal Engine-based productions: getting your live-action camera perfectly aligned with the virtual camera in your 3D software.
How do you match the live-action camera to the 3D camera?
Normally you'd put a sensor on top of the camera and tracking markers on the ceiling. As the camera moves, the sensor calculates its position from the tracking markers and streams that data to Unreal Engine, which copies it to the 3D camera. Most motion trackers on the market today have limitations with latency, accuracy, and orientation (you can't roll your camera too much or the sensor will lose sight of the tracking markers).
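The tracker-to-Unreal data flow above can be sketched as a toy simulation. This is a minimal illustration, not any real tracking SDK: the `Pose` fields and the `delay_frames` buffer are assumptions standing in for whatever an optical tracker actually streams, and the point is only that the pose the 3D camera receives lags the pose the real camera is at.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Pose:
    # Hypothetical six-axis camera pose: position in metres,
    # rotation in degrees. Real trackers stream something similar.
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

def delayed_stream(samples, delay_frames):
    """Model tracker latency: each frame, the pose that reaches the
    3D camera is one the real camera occupied delay_frames ago."""
    buffer = deque(maxlen=delay_frames + 1)
    received = []
    for pose in samples:
        buffer.append(pose)
        received.append(buffer[0])  # oldest buffered pose is delivered
    return received

# The real camera glides from x=0 to x=4 over five frames; with two
# frames of tracker latency, the virtual camera is still at x=2
# when the real camera reaches x=4.
frames = [Pose(x=float(i), y=0.0, z=1.8, pan=0.0, tilt=0.0, roll=0.0)
          for i in range(5)]
delayed = delayed_stream(frames, delay_frames=2)
```

Even a lag of a few frames like this is visible on set: the background parallax trails the physical camera move, which is exactly the mismatch that breaks the composite.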
How does motion control solve this?
Robots are designed to move between pre-defined X, Y, and Z coordinates; they are inherently programmed to know exactly where they are in real-world space at any given moment. So instead of sending Unreal Engine approximate data that is translated and delivered with latency, robots can provide perfect, instantaneous coordinates that can be copied in real time by the 3D camera.
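To contrast with the optical-tracker sketch, here is the same idea for motion control. This is a hypothetical rig class, not a real robot API: because the robot only ever moves to coordinates it was commanded to reach, the pose it reports is the commanded pose itself, with no estimation step in between.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Pose:
    # Hypothetical six-axis pose: position in metres, rotation in degrees.
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

class MotionControlRig:
    """Illustrative motion-control rig: the reported pose IS the
    commanded pose, so there is nothing to estimate and no sensor
    noise or marker loss to worry about."""

    def __init__(self, home: Pose):
        self.pose = home

    def move_to(self, target: Pose) -> Pose:
        # A real controller would interpolate the move; the key point
        # is that the coordinates it reports back are exact.
        self.pose = target
        return self.pose  # forwarded verbatim to the 3D camera

rig = MotionControlRig(home=Pose(0.0, 0.0, 1.5, 0.0, 0.0, 0.0))
virtual_camera_pose = rig.move_to(Pose(1.2, 0.5, 1.8, 15.0, -5.0, 0.0))
```

Because the virtual camera copies the commanded coordinates directly, the live-action and 3D cameras stay aligned frame for frame, and moves can even roll the camera freely since no ceiling markers need to stay in view.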
One drawback of our test workflow was that the camera we used only had HD-SDI out, so we could only feed an HD signal to the Ultimatte and then to Unreal Engine. While this lower resolution actually helps the computers process more quickly, a 4K+ video output is strongly preferred for pulling clean green screen keys.
Here's how it turned out: