The History of Virtual Production

Virtual production has been on everyone’s lips in the film industry for a couple of years now, but like all new technology it didn’t just appear overnight. Let’s trace the incremental steps that brought us to the likes of The Mandalorian and beyond.

The major component of virtual production – shooting actors against a large LED screen displaying distant or non-existent locations – has its roots in the front- and rear-projection common throughout much of the 20th century. This involved a film projector throwing pre-recorded footage onto a screen behind the talent. It was used for driving scenes in countless movies from North by Northwest to Terminator 2: Judgment Day, though by the time of the latter most filmmakers preferred blue screen.

Cary Grant films the crop duster scene from “North by Northwest”

The problem with blue and green screens is that they reflect those colours onto the talent. If the screen is blue and the inserted background is clear sky that might be acceptable, but in most cases it requires careful lighting and post-production processing to eliminate the blue or green spill.

Wanting to replace these troublesome reflections with authentic ones, DP Emmanuel Lubezki, ASC, AMC conceived an “LED Box” for 2013’s Gravity. This was a 20’ cube made of LED screens displaying CG interiors of the spacecraft or Earth slowly rotating beneath the characters. “We were projecting light onto the actors’ faces that could have darkness on one side, light on another, a hot spot in the middle and different colours,” Lubezki told American Cinematographer. “It was always complex.” Gravity’s screens were of a low resolution by today’s standards, certainly not good enough to pass as real backgrounds on camera, so the full-quality CGI had to be rotoscoped in afterwards, but the lighting on the cast was authentic. 

Sandra Bullock in “Gravity’s” LED box

Around the same time Netflix’s House of Cards was doing something similar for its driving scenes, surrounding the vehicle with chromakey green but rigging LED screens just out of frame. The screens showed pre-filmed background plates of streets moving past, which created realistic reflections in the car’s bodywork and nuanced, dynamic light on the actors’ faces.

Also released in 2013 was the post-apocalyptic sci-fi Oblivion. Many scenes took place in the Sky Tower, a glass-walled outpost above the clouds. The set was surrounded by 500×42’ of white muslin onto which cloud and sky plates shot from atop a volcano were front-projected. Usually, projected images are not bright enough to reflect useful light onto the foreground, but by layering up 21 projectors DP Claudio Miranda, ASC was able to achieve a T1.3-2.0 split at ISO 800. And unlike the low-res imagery of Gravity’s LED Box, Oblivion’s backgrounds were good enough not to need replacing in post.

The set of “Oblivion” surrounded by front-projected sky backgrounds

It would take another few years for LED screens to reach that point.

By 2016 the technology was well established as a means of creating complex light sources. Deepwater Horizon, based on the true story of the Gulf of Mexico oil rig disaster, made use of a 42×24’ video wall comprising 252 LED panels. “Fire caused by burning oil is very red and has deep blacks,” DP Enrique Chediak, ASC explained to American Cinematographer, noting that propane fires generated by practical effects crews are more yellow. The solution was to light the cast with footage of genuine oil fires displayed on the LED screen.

Korean zombie movie Train to Busan used LED walls both for lighting and in-camera backgrounds zipping past the titular vehicle. Murder on the Orient Express would do the same the following year.

The hyperspace VFX displayed on a huge LED screen for “Rogue One”

Meanwhile, on the set of Rogue One, vehicles were travelling a little bit faster; a huge curved screen of WinVision Air panels (with a 9mm pixel pitch, again blocky by today’s standards) displayed a hyperspace effect around spacecraft, providing both interactive lighting and in-camera VFX so long as the screen was well out of focus. The DP was Greig Fraser, ACS, ASC, whose journey into virtual production was about to coincide with that of actor/director/producer Jon Favreau.

Favreau had used LED screens for interactive lighting on The Jungle Book, then for 2018’s The Lion King he employed a virtual camera system driven by the gaming engine Unity. When work began on The Mandalorian another gaming engine, Unreal, allowed a major breakthrough: real-time rendered, photo-realistic CG backgrounds. “It’s the closest thing to playing God that a DP can ever do,” Fraser remarked to British Cinematographer last year. “You can move the sun wherever you want.”

Since then we’ve seen LED volumes used prominently in productions like The Midnight Sky, The Batman and now Star Trek: Strange New Worlds, with many more using them for the odd scene here and there. Who knows what the next breakthrough might be?


The Pros and Cons of Master Shots

A master is a wide shot that covers all the action in a scene. The theory is that if you run out of time, your lead actor suddenly gets injured, or some other calamity prevents you from shooting any coverage, at least you’ve captured the whole scene in a useable, if not ideal, form.

I have always been a fan of shooting masters. I remember once reading about a Hollywood film with a lot of puppets – it might have been Walter Murch’s 1985 Return to Oz – which fell seriously behind schedule. A producer or consultant was dispatched to the set to get things back on track, and concluded that part of the problem was a lack of masters. The director had been avoiding them because it was impossible to hide the puppeteers and rigging in wide shots, and instead was shooting scenes in smaller, tighter pieces. As a consequence, the cast and crew never saw the whole scene played out and struggled to understand how each piece fitted in, causing mistakes and necessitating time-consuming explanations.

For me, that’s the key benefit of masters: getting everyone on the same page so that the coverage goes faster.

A master shot of mine from “Forever Alone”, a student film I helped out on several years back 

You can dig yourself into holes if you don’t start with a wide. A small part of the set gets dressed and lit, a small part of the scene gets rehearsed, and then when you come to do the next part you realise it’s not going to fit together. A key prop that should have been in the background was forgotten because it wasn’t relevant to the first small piece; now you can’t put it in because you’ll break continuity. A light source that looked beautiful in that mid-shot is impossible to replicate in a later wide without seeing lamps or rigging. However much you might plan these things, inevitably in the heat of filming you get tunnel vision about the shot in front of you and everything else fades away. And it’s easy for a director, who has the whole film running on a cinema screen in their head, to forget that everyone else can’t see it as clearly.

Not starting with a wide also robs a DP of that vital, low-pressure time to light the whole set, getting all the sources in place that will be needed for the scene, so that re-lights for coverage can be quick and smooth. It also ties the editor’s hands somewhat if they haven’t got a wide shot to fall back on to get around problems.

So there are many benefits to masters. But lately I’ve been wondering if it’s dogmatic to say that they’re essential. I’ve worked with a few directors who have shot scenes in small, controlled pieces with great confidence and success.

Not shooting a master on “Harvey Greenfield is Running Late”. Photo: Mikey Kowalczyk

Last year I worked on a comedy with a scene set at a school play, the main action taking place in the audience. Jonnie Howard, the director, was not interested in shooting a master of the hall showing the audience, the stage and the whole chunk of play that is performed during the action. All he wanted of the play was to capture certain, specific beats in mid-shots. He didn’t even know what was happening on stage the rest of the time. He knew exactly when he was going to cut to those shots, and more importantly that it would be funnier to only ever see those random moments. He also recognised that it was easier on the child actors to be given instructions for short takes, shot by shot, rather than having to learn a protracted performance.

Not shooting masters saved us valuable time on that film. It’s not the right approach for every project; it depends on the director, how well they’re able to visualise the edit, and how much flexibility they want the editor to have. It depends on the actors too; some are more able to break things down into small pieces without getting lost, while others always like to have the run-up of “going from the top”.

There is a halfway house, which is to rehearse the whole scene, but not to shoot it. This requires clear communication with the 1st AD, however, or you’ll find that certain actors who aren’t in the first shot are still tied up in make-up when you want to rehearse. Like any way of working, it’s always best to be clear about it with your key collaborators up front, so that the pros can be maximised, the cons can be minimised, and everyone does their best work most efficiently.

A rare master shot from “Heretiks”