The worlds of filmmaking and video game technology are colliding. Just as Hideo Kojima draws on film as a core inspiration for games like Death Stranding, Jon Favreau turned to video game technology to film The Mandalorian. The Disney+ series uses what has repeatedly been called “groundbreaking new technology” for the film industry, something ILM calls “StageCraft.” Using Epic’s Unreal Engine, the filmmakers render a scene in real time and project it onto massive LED screens that surround the acting space. The rendered imagery is tracked to the camera with pixel-perfect accuracy, keeping perspective and parallax correct no matter how the camera moves.
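The camera-tracking trick described above is commonly implemented as an off-axis (asymmetric-frustum) projection: given the tracked camera position and the positions of the screen’s corners, you build a projection matrix whose frustum passes exactly through the screen plane, so the wall shows what a window in that position would. Here is a minimal sketch in plain Python, following the well-known generalized perspective projection formulation; the function name and the OpenGL-style matrix convention are my assumptions, not ILM’s actual code:

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Asymmetric-frustum projection for a tracked camera.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners (world space)
    pe: tracked camera/eye position (world space)
    """
    vr = normalize(sub(pb, pa))    # screen-space right axis
    vu = normalize(sub(pc, pa))    # screen-space up axis
    vn = normalize(cross(vr, vu))  # screen normal, pointing toward the eye

    va, vb_, vc_ = sub(pa, pe), sub(pb, pe), sub(pc, pe)
    d = -dot(va, vn)               # perpendicular eye-to-screen distance

    # Frustum extents, scaled back to the near plane
    l = dot(vr, va) * near / d
    r = dot(vr, vb_) * near / d
    b = dot(vu, va) * near / d
    t = dot(vu, vc_) * near / d

    # OpenGL-style asymmetric frustum matrix (row-major)
    return [
        [2*near/(r-l), 0.0,          (r+l)/(r-l),           0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),           0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ]
```

When the camera sits centered in front of the screen the frustum is symmetric; as it moves off to one side, the `(r+l)/(r-l)` term shifts the frustum, which is exactly what produces correct parallax on the wall.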
This method of creating locations and backplates has a ton of advantages. Instead of scouting locations or building complex sets, filmmakers can create the perfect location themselves and have it ready for filming within 24 hours. If adjustments are needed, such as repositioning elements or changing coloration or lighting, they can be made on the fly. Time of day isn’t an issue either: the production team described shooting a 10-hour day entirely at dawn, because they can simply set the scene to whatever moment they want. No more waiting for perfect weather or lighting.
For the actors, it’s a much more immersive way of filming. The Mandalorian’s creators have talked about almost forgetting they were on a set in the middle of a scene. And because the environment is captured in camera rather than added to a green screen in post, lighting, reflections, and color are more accurate from the start. That means less post-production cleanup, and it gives the creators more freedom to make creative effects decisions while filming rather than in the editing process.
Using Unreal Engine and LED screens is well suited to faster TV production, letting shows like The Mandalorian build big-budget sets and locations while getting everything filmed and edited faster. The production team continued to expand on the idea throughout The Mandalorian’s first season, even bringing physical set pieces right onto the LED stage. This added dynamic lighting and reflections off the physical objects, gave the scenes more physical interaction, and further blurred the line between what was digitally created and what was actually there on the set.
You can get a look at the whole process in action in the short behind-the-scenes video below. Note in the thumbnail image how a portion of the ship is a physical set piece, while the upper portion and engines are simply screen projections. Because the scene is rendered by the game engine in 3D space, the camera can be mapped into that space so that perspective and angle stay accurate while filming.
More than 50% of The Mandalorian was filmed using this technique. The set is a “20’ high by 270-degree semicircular LED video wall and ceiling with a 75’-diameter performance space.” The digital set pieces were created by ILM (Industrial Light & Magic) using Unreal Engine for real-time interactivity, with the high-quality, perspective-accurate rendering of the 3D imagery handled by NVIDIA GPUs (on four synced PCs, according to Epic).
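Those quoted dimensions allow a quick back-of-envelope check on the scale of the wall. Assuming the 75’ diameter applies to the LED arc itself (the screens may actually stand outside the performance space, so treat this as a rough lower bound), a 270-degree arc works out as follows:

```python
import math

# Back-of-envelope figures from the quoted stage dimensions (assumed geometry)
diameter_ft = 75.0
height_ft = 20.0
arc_fraction = 270.0 / 360.0  # a 270-degree slice of the full circle

arc_length_ft = math.pi * diameter_ft * arc_fraction  # ≈ 176.7 ft of curved wall
wall_area_sqft = arc_length_ft * height_ft            # ≈ 3,534 sq ft of LEDs
```

That’s roughly 177 feet of curved wall and around 3,500 square feet of LED panels, before even counting the ceiling.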
As many comments on the YouTube video point out, ILM (along with partners at Epic, Golem Creations, Fuse, Lux Machina, Profile Studios, and ARRI) has essentially created Star Trek’s holodeck, ironically first used to film a Star Wars show. The massive LED screen set effectively becomes a giant VR headset that the actors and filmmakers create the show inside. We’ll undoubtedly see this technology evolve and gain wider adoption in television and movie production, and perhaps game developers will adopt and expand on some of its ideas themselves, using what Epic learned during The Mandalorian’s production.