“Jurassic Park” Retrospective

With the temporary closure of Cineworld cinemas around the UK, the future of theatrical exhibition once more hangs in the balance. But just a couple of months ago cinemas were reopening and people were positive that the industry would recover. One of the classic blockbusters that was re-released to plug the gaps in the release schedule ahead of Christopher Nolan's Tenet was a certain quite popular film about dinosaurs. I described my trip to see it recently, but let's put that hideous experience behind us and concentrate on the film itself.

Thanks in no small part to the excellent “making of” book by Don Shay and Jody Duncan, Jurassic Park was a formative experience for the 13-year-old Neil Oseman, setting me irrevocably on the path to filmmaking as a career. So let me take you back in time and behind the scenes of an iconic piece of popcorn fodder.

 

Man creates dinosaurs

Even before author Michael Crichton delivered the manuscript of his new novel in May 1990, Steven Spielberg had expressed an interest in adapting it. A brief bidding war between studios saw Joe Dante (Gremlins), Tim Burton (Batman) and Richard Donner (Superman) in the frame to direct, but Spielberg and Universal Pictures were the victors.

Storyboards by David Lowery. Lots of the film’s storyboards are reproduced in “The Making of Jurassic Park” by Don Shay and Jody Duncan.

The screenplay went through several drafts, first by Crichton himself, then by Malia Scotch Marmo and finally by David Koepp, who would go on to script Mission: Impossible, Spider-Man and Panic Room. Pre-production began long before Koepp finished writing, with Spielberg generating storyboards based directly on scenes from the book so that his team could figure out how they were going to bring the dinosaurs to life.

Inspired by a life-size theme park animatronic of King Kong, Spielberg initially wanted all the dinosaurs to be full-scale physical creatures throughout. This was quickly recognised as impractical, and instead Stan Winston Studio, creators of the Terminator endoskeleton, the Predator make-up and the fifteen-foot-tall Alien queen, focused on building full-scale hydraulically-actuated dinosaurs that would serve primarily for close-ups and mids.

Stan Winston’s crew with their hydraulic behemoth

Meanwhile, to accomplish the wider shots, Spielberg hired veteran stop-motion animator Phil Tippett, whose prior work included ED-209 in RoboCop, the tauntaun and AT-AT walkers in The Empire Strikes Back, and perhaps most relevantly, the titular creature from Dragonslayer. After producing some beautiful animatics – to give the crew a clearer previsualisation of the action than storyboards could provide – Tippett shot test footage of the “go-motion” process he intended to employ for the real scenes. Whilst this footage greatly improved on traditional stop-motion by incorporating motion blur, it failed to convince Spielberg.

At this point, Dennis Muren of Industrial Light and Magic stepped in. Muren was the visual effects supervisor behind the most significant milestones in computer-generated imagery up to that point: the stained-glass knight in Young Sherlock Holmes (1985), the water tendril in The Abyss (1989) and the liquid metal T-1000 in Terminator 2: Judgment Day (1991). When Spielberg saw his test footage – initially just skeletons running in a black void – the fluidity of the movement immediately grabbed the director's attention. Further tests, culminating in a fully-skinned tyrannosaur stalking a herd of gallimimuses, had Spielberg completely convinced. On seeing the tests himself, Tippett famously quipped: "I think I'm extinct."

The first CGI test

Tippett continued to work on Jurassic Park, however, ultimately earning a credit as dinosaur supervisor. Manipulating a custom-built armature named the Dinosaur Input Device, Tippett and his team were able to have their hands-on techniques recorded by computer and used to drive the CG models.

Building on his experiences working with the E.T. puppet, Spielberg pushed for realistic animal behaviours, visible breathing, and bird-like movements reflecting the latest paleontological theories, all of which would lend credibility to the dinosaurs. Effects co-supervisor Mark Dippe stated: "We used to go outdoors and run around and pretend we were gallimimuses or T-Rexes hunting each other, and shoot [reference] film."

 

Dinosaurs eat man

Stan Winston’s triceratops was the first dinosaur to go before the cameras, and the only one to be filmed on location.

Production began in August 1992 with three weeks on the Hawaiian island of Kauai. Filming progressed smoothly until the final day on location, which had to be scrubbed due to Hurricane Iniki (although shots of the storm made it into the finished film). After a brief stint in the Mojave Desert, the crew settled into the stages at Universal Studios and Warner Brothers to record the bulk of the picture.

The most challenging sequence to film would also prove to be the movie’s most memorable: the T-Rex attack on the jeeps containing Sam Neill’s Dr. Grant, Jeff Goldblum’s Ian Malcolm, lawyer Gennaro and the children, Lex and Tim. It was the ultimate test for Stan Winston’s full-scale dinosaurs.

The T-Rex mounted on its motion simulator base on Stage 16 at Warner Brothers

The main T-Rex puppet weighed over six tonnes and was mounted on a flight simulator-style platform that had to be anchored into the bedrock under the soundstage. Although its actions were occasionally pre-programmed, the animal was mostly puppeteered live using something similar to the Dinosaur Input Device.

But the torrential rain in which the scene takes place was anathema to the finely tuned mechanics and electronics of the tyrannosaur. “As [the T-Rex] would get rained on,” Winston explained, “his skin would soak up water, his weight would change, and in the middle of the day he would start having the shakes and we would have to dry him down.”

Although hints of this shaking can be detected by an eagle-eyed viewer, the thrilling impact of the overall sequence was clear to Spielberg, who recognised that the T-Rex was the star of his picture. He hastily rewrote the ending to bring the mighty creature back, relying entirely on CGI for the new climax in which it battles raptors in the visitor centre’s rotunda.

The CGI T-Rex in the rewritten finale

 

Woman inherits the earth

After wrapping 12 days ahead of schedule, Jurassic Park hit US cinemas on June 11th, 1993. It became the highest-grossing film of all time, a title which it would hold until Titanic’s release four years later. 1994’s Oscar ceremony saw the prehistoric blockbuster awarded not only Best Visual Effects but also Best Sound Editing and Best Sound Mixing. Indeed, Gary Rydstrom’s contribution to the film – using everything from a dolphin/walrus combination for the raptors’ calls, to the sound of his own dog playing with a rope toy for the T-Rex – cannot be overstated.

Jurassic Park has spawned four sequels to date (with a fifth on the way), and its impact on visual effects was enormous. For many years afterwards, blockbusters were filled with CGI that was unable to equal, let alone surpass, the quality of Jurassic Park's. Watching it today, the CGI is still impressive if a little plasticky in texture, but I believe that the full-size animatronics which form the lion's share of the dinosaurs' screen time are what truly give the creatures their memorable verisimilitude. The film may be 27 years old, but it's still every bit as entertaining as it was in 1993.

This article first appeared on RedShark News.

Director of photography Dean Cundey, ASC with the brachiosaur head puppet

Will “The Mandalorian” Revolutionise Filmmaking?

Last week, Greig Fraser, ASC, ACS and Baz Idoine were awarded the Emmy for Outstanding Cinematography for a Single-camera Series (Half-hour) for The Mandalorian. I haven’t yet seen this Star Wars TV series, but I’ve heard and read plenty about it, and to call it a revolution in filmmaking is not hyperbole.

Half of the series was not shot on location or on sets, but on something called a volume: a stage with walls and ceiling made of LED screens, 20ft tall, 75ft across and encompassing 270° of the space. I've written before about using large video screens to provide backgrounds in limited ways, outside of train windows for example, and using them as sources of interactive light, but the volume takes things to a whole new level.

In the past, the drawback of the technology has been one of perspective; it’s a flat, two-dimensional screen. Any camera movement revealed this immediately, because of the lack of parallax. So these screens tended to be kept to the deep background, with limited camera movement, or with plenty of real people and objects in the foreground to draw the eye. The footage shown on the screens was pre-filmed or pre-rendered, just video files being played back.

The Mandalorian‘s system, run by multiple computers simultaneously, is much cleverer. Rather than a video clip, everything is rendered in real time from a pre-built 3D environment known as a load, running on software developed for the gaming industry called Unreal Engine. Around the stage are a number of witness cameras which use infra-red to monitor the movements of the cinema camera in the same way that an actor is performance-captured for a film like Avatar. The data is fed into Unreal Engine, which generates the correct shifts in perspective and sends them to the video walls in real time. The result is that the flat screen appears, from the cinema camera’s point of view, to have all the depth and distance required for the scene.
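To make that a little more concrete, here is a minimal Python sketch of the underlying idea – purely illustrative, and nothing to do with the actual StageCraft/Unreal Engine code. Given the tracked camera position and the corners of one flat LED panel, it builds an off-axis ("asymmetric frustum") projection matrix, so that whatever is rendered onto the panel carries the correct parallax for that single viewpoint. In the real system this happens continuously, in real time, as the witness cameras update the cinema camera's position.

```python
import numpy as np

def off_axis_projection(eye, wall_ll, wall_lr, wall_ul, near=0.1, far=1000.0):
    """Asymmetric-frustum ("off-axis") projection for one flat LED panel.

    eye      -- tracked cinema-camera position in metres
    wall_ll  -- lower-left corner of the panel in world space
    wall_lr  -- lower-right corner
    wall_ul  -- upper-left corner
    Returns a 4x4 OpenGL-style projection matrix, following Kooima's
    "generalized perspective projection" (projection part only -- the
    full method also aligns the view matrix with the screen plane).
    """
    eye, pa, pb, pc = (np.asarray(v, dtype=float)
                       for v in (eye, wall_ll, wall_lr, wall_ul))

    # Orthonormal basis of the screen plane
    vr = pb - pa; vr /= np.linalg.norm(vr)           # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)           # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # screen normal, towards the eye

    # Vectors from the tracked eye point to the panel corners
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                              # eye-to-wall distance

    # Frustum extents projected onto the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# As the tracked camera dollies sideways the frustum skews with it,
# which is what produces believable parallax on a flat wall.
print(off_axis_projection(eye=(1.0, 1.5, 4.0),
                          wall_ll=(-6.0, 0.0, 0.0),
                          wall_lr=( 6.0, 0.0, 0.0),
                          wall_ul=(-6.0, 6.0, 0.0)))
```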

The loads are created by CG artists working to the production designer's instructions, and textured with photographs taken at real locations around the world. In at least one case, a miniature set was built by the art department and then digitised. The scene is lit with virtual lights by the DP – all this still during preproduction.

The volume’s 270° of screens, plus two supplementary, moveable screens in the 90° gap behind camera, are big enough and bright enough that they provide most or all of the illumination required to film under. The advantages are obvious. “We can create a perfect environment where you have two minutes to sunset frozen in time for an entire ten-hour day,” Idoine explains. “If we need to do a turnaround, we merely rotate the sky and background, and we’re ready to shoot!”

Traditional lighting fixtures are used minimally on the volume, usually for hard light, which the omni-directional pixels of an LED screen can never reproduce. If the DPs require soft sources beyond what is built into the load, the technicians can turn any off-camera part of the video screens into an area of whatever colour and brightness are required –  a virtual white poly-board or black solid, for example.

A key reason for choosing the volume technology was the reflective nature of the eponymous Mandalorian’s armour. Had the series been shot on a green-screen, reflections in his shiny helmet would have been a nightmare for the compositing team. The volume is also much more actor- and filmmaker-friendly; it’s better for everyone when you can capture things in-camera, rather than trying to imagine what they will look like after postproduction. “It gives the control of cinematography back to the cinematographer,” Idoine remarks. VR headsets mean that he and the director can even do a virtual recce.

The Mandalorian shoots on the Arri Alexa LF (large format), giving a shallow depth of field which helps to avoid moiré problems with the video wall. To ensure accurate chromatic reproduction, the wall was calibrated to the Alexa LF’s colour filter array.
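As a rough illustration of why the shallow depth of field matters, here is a back-of-the-envelope Python sketch using the thin-lens formula. The 2.8mm wall pixel pitch, the T-stop, the distances and the "equivalent" focal lengths are all assumptions for the sake of the example, not production figures. The point is simply that, for the same framing, the larger format's longer lens smears each LED pixel across more of its neighbours, so the wall's grid has less chance of beating against the sensor's photosite grid and producing moiré.

```python
def blur_circle_mm(focal_mm, t_stop, focus_m, wall_m):
    """Diameter on the sensor (mm) of the defocus blur for a point at
    wall_m when the lens is focused at focus_m (thin-lens approximation)."""
    f = focal_mm / 1000.0
    blur_m = f ** 2 * abs(wall_m - focus_m) / (t_stop * wall_m * (focus_m - f))
    return blur_m * 1000.0

def wall_pixel_on_sensor_mm(focal_mm, wall_m, pitch_mm=2.8):
    """Approximate on-sensor image size (mm) of one LED-wall pixel."""
    return pitch_mm * (focal_mm / 1000.0) / wall_m

# Same framing and T-stop on two formats: the larger format needs a
# longer lens, which defocuses the wall far more heavily.
for label, focal in (("Super 35-ish, 35mm", 35.0), ("Large format, 50mm", 50.0)):
    blur = blur_circle_mm(focal, t_stop=2.0, focus_m=3.0, wall_m=9.0)
    pixel = wall_pixel_on_sensor_mm(focal, wall_m=9.0)
    print(f"{label}: blur {blur:.3f} mm vs wall pixel {pixel:.4f} mm "
          f"-> blur spans ~{blur / pixel:.0f} LED pixels")
```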

Although the whole system was expensive to set up, once up and running it’s easy to imagine how quickly and cheaply the filmmakers can shoot on any given “set”. The volume has limitations, of course. If the cast need to get within a few feet of a wall, for example, or walk through a door, then that set-piece has to be real. If a scene calls for a lot of direct sunlight, then the crew move outside to the studio backlot. But undoubtedly this technology will improve rapidly, so that it won’t be long before we see films and TV episodes shot entirely on volumes. Perhaps one day it could overtake traditional production methods?

For much more detailed information on shooting The Mandalorian, see this American Cinematographer article.


10 Clever Camera Tricks in “Aliens”

In 1983, up-and-coming director James Cameron was hired to script a sequel to Ridley Scott’s 1979 hit Alien. He had to pause halfway through to shoot The Terminator, but the subsequent success of that movie, along with the eventually completed Aliens screenplay, so impressed the powers that be at Fox that they greenlit the film with the relatively inexperienced 31-year-old at the helm.

Although the sequel was awarded a budget of $18.5 million – $7.5 million more than Scott’s original – that was still tight given the much more expansive and ambitious nature of Cameron’s script. Consequently, the director and his team had to come up with some clever tricks to put their vision on celluloid.

 

1. Mirror Image

When contact is lost with the Hadley’s Hope colony on LV-426, Ripley (Sigourney Weaver) is hired as a sort of alien-consultant to a team of crack marines. The hypersleep capsules from which the team emerge on reaching the planet were expensive to build. Production designer Peter Lamont’s solution was to make just half of them, and place a mirror at the end of the set to double them up.

 

2. Small Screens

Wide shots of Hadley’s Hope were accomplished with fifth-scale miniatures by Robert and Dennis Skotak of 4-Ward Productions. Although impressive, sprawling across two Pinewood stages, the models didn’t always convince. To help, the crew often downgraded the images by showing them on TV monitors, complete with analogue glitching, or by shooting through practical smoke and rain.

 

3. Big Screens

The filmmakers opted for rear projection to show views out of cockpit windscreens and colony windows. This worked out cheaper than blue-screen composites, and allowed for dirt and condensation on the glass, which would have been impossible to key optically. Rear projection was also employed for the crash of the dropship – the marines’ getaway vehicle – permitting camera dynamics that again were not possible with compositing technology of the time.

 

4. Back to Front

A highlight of Aliens is the terrifying scene in which Ripley and her young charge Newt (Carrie Henn) are trapped in a room with two facehuggers, deliberately set loose by sinister Company man Carter Burke (Paul Reiser). These nightmarish spider-hands were primarily puppets trailing cables to their operators. To portray them leaping onto a chair and then towards camera, a floppy facehugger was placed in its final position and then tugged to the floor with a fishing wire. The film was reversed to create the illusion of a jump.

 

5. Upside Down

Like Scott before him, Cameron was careful to obfuscate the man-in-a-suit nature of the alien drones wherever possible. One technique he used was to film the creatures crawling on the floor, with the camera upside-down so that they appeared to be hanging from the ceiling. This is seen when Michael Biehn’s Hicks peeks through the false ceiling to find out how the motion-tracked aliens can be “inside the room”.

 

6. Flash Frames

All hell (represented by stark red emergency lighting) breaks loose when the aliens drop through the false ceiling. To punch up the visual impact of the movie’s futuristic weapons, strobelights were aimed at the trigger-happy marines. Taking this effect even further, editor Ray Lovejoy spliced individual frames of white leader film into the shots. As a result, the negative cutter remarked that Aliens‘ 12th reel had more cuts than any complete movie he’d ever worked on.

 

7. Cotton Cloud

With most of the marines slaughtered, Ripley heads to the atmospheric processing plant to rescue Newt from the alien nest. Aided by the android Bishop (Lance Henriksen) they escape just before the plant’s nuclear reactor explodes. The ensuing mushroom cloud is a miniature sculpture made of cotton wool and fibreglass, illuminated by an internal lightbulb!

 

8. Hole in the floor

Returning to the orbiting Sulaco, Ripley and friends are ambushed by the stowaway queen, who rips Bishop in half. A pre-split, spring-loaded dummy of Henriksen was constructed for that moment, and was followed by the simple trick of concealing the actor’s legs beneath a hole in the floor. As in the first movie, android blood was represented by milk. This gradually soured as the filming progressed, much to Henriksen’s chagrin as the script required him to be coated in the stuff and even to spit it out of his mouth.

 

9. Big Battle

The alien queen was constructed and operated by Stan Winston Studios as a full-scale puppet. Two puppeteers were concealed inside, while others moved the legs with rods or controlled the crane from which the body hung. The iconic power loader was similar, with a body builder concealed inside and a counter-weighted support rig. This being before the advent of digital wire removal, all the cables and rods had to be obfuscated with smoke and shifting shadows, though they can still be seen on frame grabs like this one. (The queen is one of my Ten Greatest Movie Puppets of All Time.)

 

10. Little Battle

For wide shots of the final fight, both the queen and the power loader were duplicated as quarter scale puppets. Controlled from beneath the miniature set via rods and cables, the puppets could perform big movements, like falling into the airlock, which would have been very difficult with the full-size props. (When the airlock door opens, the starfield beyond is a black sheet with Christmas lights on it!) The two scales cut seamlessly together and produce a thrilling finale to this classic film.

For more on the visual effects of James Cameron movies, see my rundown of the top five low-tech effects in Hollywood films (featuring Titanic) and a breakdown of the submarine chase in The Abyss.


The Long Lenses of the 90s

Lately, having run out of interesting series, I’ve found myself watching a lot of nineties blockbusters: Outbreak, Twister, Dante’s Peak, Backdraft, Daylight. Whilst eighties movies were the background to my childhood, and will always have a place in my heart, it was the cinema of the nineties that I was immersed in as I began my own amateur filmmaking. So, looking back on those movies now, while certain clichés stand out like sore thumbs, they still feel to me like solid examples of how to make a summer crowd-pleaser.

Let’s get those clichés out of the way first. The lead character always has a failed marriage. There’s usually an opening scene in which they witness the death of a spouse or close relative, before the legend “X years later” fades up. The dog will be saved, but the crotchety elderly character will die nobly. Buildings instantly explode towards camera when touched by lava, hurricanes, floods or fires. A stubborn senior authority figure will refuse to listen to the disgraced lead character who will ultimately be proven correct, to no-one’s surprise.

Practical effects in action on “Twister”

There’s an intensity to nineties action scenes, born of the largely practical approach to creating them. The decade was punctuated by historic advances in digital effects: the liquid metal T-1000 in Terminator 2 (1991), digital dinosaurs in Jurassic Park (1993), motion-captured passengers aboard the miniature Titanic (1997), Bullet Time in The Matrix (1999). Yet these techniques remained expensive and time-consuming, and could not match traditional methods of creating explosions, floods, fire or debris. The result was that the characters in jeopardy were generally surrounded by real set-pieces and practical effects, a far more nerve-wracking experience for the viewer than today, when we can tell that our heroes are merely imagining their peril on a green-screen stage.

One thing I was looking out for during these movie meanders down memory lane was lens selection. A few weeks back, a director friend asked me to suggest examples of films that preferred long lenses. He mentioned that such lenses were more in vogue in the nineties, which I'd never thought about before.

As soon as I started to consider it, I realised how right my friend was. And how much that long-lens look had influenced me. When I started out making films, I was working with the tiny sensors of Mini-DV cameras. I would often try to make my shots look more cinematic by shooting on the long end of the zoom. This was partly to reduce the depth of field, but also because I instinctively felt that the compressed perspective was more in keeping with what I saw at the cinema.

I remember being surprised by something that James Cameron said in his commentary on the Aliens DVD:

I went to school on Ridley [Scott]’s style of photography, which was actually quite a bit different from mine, because he used a lot of long lenses, much more so than I was used to working with.

I had assumed that Cameron used long lenses too, because I felt his films looked incredibly cinematic, and because I was so sure that cinematic meant telephoto. I’ve discussed in the past what I think people tend to mean by the term “cinematic”, and there’s hardly a definitive answer, but I’m now sure that lens length has little to do with it.

“Above the Clouds” (dir. Leon Chambers)

And yet… are those nineties films influencing me still? I have to confess, I struggle with short lenses to this day. I find it hard to make wide-angle shots look as good. On Above the Clouds, to take just one example, I frequently found that I preferred the wide shots on a 32mm rather than a 24mm. Director Leon Chambers agreed; perhaps those same films influenced him?

A deleted scene from Ren: The Girl with the Mark ends with some great close-ups shot on my old Sigma 105mm still lens, complete with the slight wobble of wind buffeting the camera, which to my mind only adds to the cinematic look! On a more recent project, War of the Worlds: The Attack, I definitely got a kick from scenes where we shot the heroes walking towards us down the middle of the street on a 135mm.

Apart from the nice bokeh, what does a long lens do for an image? I’ve already mentioned that it compresses perspective, and because this is such a different look to human vision, it arguably provides a pleasing unreality. You could describe it as doing for the image spatially what the flicker of 24fps (versus high frame rates) does for it temporally. Perhaps I shy away from short lenses because they look too much like real life, they’re too unforgiving, like many people find 48fps to be.
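To put some rough numbers on that spatial compression, here is a quick thin-lens sketch in Python. The 1.8m subject, the 80% frame fill, the Super 35-ish gate height and the 10m gap to the background are illustrative assumptions only. Framing the subject identically on each focal length pushes the camera further back as the lens gets longer, and the further back it goes, the closer the background's apparent size creeps towards the subject's – that narrowing gap is the compression.

```python
def compression(focal_mm, subject_height_m=1.8, frame_fill=0.8,
                sensor_height_mm=18.7, background_gap_m=10.0):
    """For identical subject framing on each focal length, return the
    camera-to-subject distance and the apparent size of a same-sized
    background object relative to the subject (thin-lens approximation)."""
    subject_image_m = frame_fill * sensor_height_mm / 1000.0
    distance_m = subject_height_m * (focal_mm / 1000.0) / subject_image_m
    return distance_m, distance_m / (distance_m + background_gap_m)

for focal in (24, 32, 50, 105, 135):
    distance, ratio = compression(focal)
    print(f"{focal:3d}mm: camera {distance:5.1f} m back, "
          f"background appears {ratio:.0%} of subject size")
```

On the 24mm the background element renders at barely a fifth of the subject's size; on the 135mm it is well over half, which is why long-lens backgrounds feel stacked up right behind the actors.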

The compression applies to people’s faces too. Dustin Hoffman is not known for his small nose, yet it appears positively petite in the close-up below from Outbreak. While this look flatters many actors, others benefit from the rounding of their features caused by a shorter lens.

Perhaps the chief reason to be cautious of long lenses is that they necessitate placing the camera further from the action, and the viewer will sense this, if only on a subconscious level. A long lens, if misused, can rob a scene of intimacy, and if overused could even cause the viewer to disengage with the characters and story.

I’ll leave you with some examples of long-lens shots from the nineties classics I mentioned at the start of this post. Make no mistake, these films employed shorter lenses too, but it certainly looks to me like they used longer lenses on average than contemporary movies.

 

Outbreak

DP: Michael Ballhaus, ASC

 

Twister

DP: Jack N. Green, ASC

 

Daylight

DP: David Eggby, ACS

 

Dante’s Peak

DP: Andrzej Bartkowiak, ASC

 

Backdraft

DP: Mikael Salomon, ASC

For more on this topic, see my article about “The Normal Lens”.


Why You Can’t Re-light Footage in Post

The concept of "re-lighting in post" is one that has enjoyed popularity amongst some no-budget filmmakers, and that sometimes gets bandied around on much bigger sets as well. If there isn't the time, the money or perhaps simply the will to light a scene well on the day, the flexibility of RAW recording and the power of modern grading software mean that the lighting can be completely changed in postproduction, so the idea goes.

I can understand why it’s attractive. Lighting equipment can be expensive, and setting it up and finessing it is one of the biggest consumers of time on any set. The time of a single wizard colourist can seem appealingly cost-effective – especially on an unpaid, no-budget production! – compared with the money pit that is a crew, cast, location, catering, etc, etc. Delaying the pain until a little further down the line can seem like a no-brainer.

There’s just one problem: re-lighting footage is fundamentally impossible. To even talk about “re-lighting” footage demonstrates a complete misunderstanding of what photographing a film actually is.

This video, captured at a trillion frames per second, shows the transmission and reflection of light.

The word “photography” comes from Greek, meaning “drawing with light”. This is not just an excuse for pompous DPs to compare themselves with the great artists of the past as they “paint with light”; it is a concise explanation of what a camera does.

A camera can’t record a face. It can’t record a room, or a landscape, or an animal, or objects of any kind. The only thing a camera can record is light. All photographs and videos are patterns of light which the viewer’s brain reverse-engineers into a three-dimensional scene, just as our brains reverse-engineer the patterns of light on the retinae every moment of every day, to make sense of our surroundings.

The light from this object gets gradually brighter then gradually darker again – therefore it is a curved surface. There is light on the top of that nose but not on the underneath, so it must be sticking out. These oval surfaces are absorbing all the red and blue light and reflecting only green, so it must be plant life. Such are the deductions made continuously by the brain’s visual centre.

A compound lens for a prototype light-field camera by Adobe

To suggest that footage can be re-lit is to suggest that recorded light can somehow be separated from the underlying physical objects off which that light reflected. Now of course that is within the realms of today’s technology; you could analyse a filmed scene and build a virtual 3D model of it to match the footage. Then you could “re-light” this recreated scene, but it would be a hell of a lot of work and would, at best, occupy the Uncanny Valley.
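A toy example makes the ambiguity concrete. Under even the simplest Lambertian shading model – and this is purely an illustration of the principle, not how any camera or grading tool works – a pixel records only albedo × max(0, N·L). A dim surface facing the camera and a brighter surface angled away from it can therefore produce exactly the same recorded value, and that single number cannot tell you which one you filmed, let alone how it should change under a new light direction. Recovering that requires the geometry back, which is precisely what the 3D-reconstruction and light-field approaches discussed below are chasing.

```python
import numpy as np

def lambert_pixel(albedo, normal, light_dir):
    """What a camera records for one pixel under a toy Lambertian model:
    albedo * max(0, N.L), collapsed into a single number."""
    n = np.asarray(normal, dtype=float); n /= np.linalg.norm(n)
    l = np.asarray(light_dir, dtype=float); l /= np.linalg.norm(l)
    return albedo * max(0.0, float(np.dot(n, l)))

recorded_light = (0.0, 0.0, 1.0)    # light straight down the lens axis
imagined_relight = (1.0, 0.0, 1.0)  # a hoped-for "re-light" from 45 degrees off axis

# Two very different surfaces that record exactly the same pixel value...
flat_surface = dict(albedo=0.5, normal=(0.0, 0.0, 1.0))
angled_surface = dict(albedo=0.5 * np.sqrt(2), normal=(1.0, 0.0, 1.0))

print(round(lambert_pixel(light_dir=recorded_light, **flat_surface), 3),
      round(lambert_pixel(light_dir=recorded_light, **angled_surface), 3))   # 0.5 0.5

# ...yet under the new light they would look completely different, and
# nothing in the recorded 0.5 tells you which one was in front of the lens.
print(round(lambert_pixel(light_dir=imagined_relight, **flat_surface), 3),
      round(lambert_pixel(light_dir=imagined_relight, **angled_surface), 3)) # 0.354 0.707
```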

Some day, perhaps some day quite soon, artificial intelligence will be clever enough to do this for us. Feed in a 2D video and the computer will analyse the parallax and light shading to build a moving 3D model to match it, allowing a complete change of lighting and indeed composition.

Volumetric capture is already a functioning technology, currently using a mix of infrared and visible-light cameras in an environment lit as flatly as possible for maximum information – like log footage pushed to its inevitable conclusion. By surrounding the subject with cameras, a moving 3D image results.

Sir David Attenborough getting his volume captured by Microsoft

Such rigs are a type of light-field imaging, a technology that reared its head a few years ago in the form of Lytro, with viral videos showing how depth of field and even camera angle (to a limited extent) could be altered with this seemingly magical system. But even Lytro was capturing light, albeit in a way that allowed for much more digital manipulation.

Perhaps movies will eventually be captured with some kind of Radar-type technology, bouncing electromagnetic waves outside the visible spectrum off the sets and actors to build a moving 3D model. At that point the need for light will have been completely eliminated from the production process, and the job of the director of photography will be purely a postproduction one.

While I suspect most DPs would prefer to be on a physical set than hunched over a computer, we would certainly make the transition if that was the only way to retain meaningful authorship of the image. After all, most of us are already keen to attend grading sessions to ensure our vision survives postproduction.

The Lytro Illum 2015 CP+ by Morio – own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=38422894

But for the moment at least, lighting must be done on set; re-lighting after the fact is just not possible in any practical way. This is not to take away from the amazing things that a skilled colourist can do, but the vignettes, the split-toning, the power windows, the masking and the tracking – these are adjustments of emphasis.

A soft shadow can be added, but without 3D modelling it can never fall and move as a real shadow would. A face can be brightened, but the quality of light falling on it can’t be changed from soft to hard. The angle of that light can’t be altered. Cinematographers refer to a key-light as the “modelling” light for a reason: because it defines the 3D model which your brain reverse-engineers when it sees the image.

So if you’re ever tempted to leave the job of lighting to postproduction, remember that your footage is literally made of light. If you don’t take the time to get your lighting right, you might as well not have any footage at all.


The Cinematography of “First Man”

A miniature Saturn V rocket is prepared for filming

If you’re a DP, you’re probably familiar with the “Guess the Format” game. Whenever you see a movie, you find yourself trying to guess what format it was shot on. Film or digital? Camera? Glass? Resolution?

As I sat in the cinema last autumn watching First Man, I was definitely playing the game. First Man tells the true story of Neil Armstrong’s (Ryan Gosling) extraterrestrial career, including his test flights in the hypersonic  X-15, his execution of the first ever docking in space aboard Gemini 8, the tragic deaths of his colleagues in the launchpad fire of Apollo 1, and of course the historic Apollo 11.

The game was given away fairly early on when I noticed frames with dust on, a sure sign of celluloid acquisition. (Though most movies have so much digital clean-up now that a lack of dust doesn't necessarily mean that film wasn't involved.) I automatically assumed 35mm, though as the film went on I occasionally wondered if I could possibly be watching Super-16. There was something of the analogue home movie about certain scenes, the way the searing highlights of the sun blasting into the space capsules rolled off and bloomed.

When I got home I tracked down this Studio Daily podcast and my suspicions were confirmed, but we’ll get to that in a minute.

 

Cinéma Vérité

Let’s start at the beginning. First Man was directed by Damien Chazelle and photographed by Linus Sandgren, FSF, the same team who made La La Land, for which both men won Oscars. What I remember most about the cinematography of that earlier film is the palette of bright but slightly sickly colours, and the choreographed Steadicam moves.

First Man couldn’t be more different, adopting a cinéma vérité approach that often looks like it could be real and previously-unseen Nasa footage. Sandgren used zoom lenses and a documentary approach to achieve this feeling:

When you do a documentary about a person and you’re there in their house with them and they’re sad or they’re talking, maybe you don’t walk in there and stand in the perfect camera position. You can’t really get the perfect angles. That in itself creates some sort of humbleness to the characters; you are a little respectful and leave them a little alone to watch them from a distance or a little bit from behind.

Similarly, scenes in the spacecraft relied heavily on POVs through the small windows of the capsule, which is all that the astronauts or a hypothetical documentary camera operator would have been able to see. This blinkered view, combined with evocative and terrifying sound design – all metallic creaks, clanks and deafening booms, like the world itself is ending – makes the spaceflight sequences incredibly visceral.

 

Multiple gauges

Scale comparison of film formats. Note that Imax is originated on 65mm stock and printed on 70mm to allow room for the soundtrack.

Documentaries in the sixties would have been shot on Super-16, which is part of the reason that Sandgren and Chazelle chose it as one of their acquisition formats. The full breakdown of formats is as follows:

  • Super-16 was employed for intense or emotional material, specifically early sequences relating to the death of Armstrong’s young daughter, and scenes inside the various spacecraft. As well as the creative considerations, the smaller size of Super-16 equipment was presumably advantageous from a practical point of view inside the cramped sets.
  • 35mm was used for most of the non-space scenes. Sandgren differentiated the scenes at Nasa from those at Armstrong’s home by push-processing the former and pull-processing the latter. What this means is that Nasa scenes were underexposed by one stop and overdeveloped, resulting in a detailed, contrasty, grainy look, while the home scenes were overexposed and underdeveloped to produce a cleaner, softer, milkier look. 35mm was also used for wide shots in scenes that were primarily Super-16, to ensure sufficient definition.
  • Imax (horizontally-fed 65mm) was reserved for scenes on the moon.

 

In-camera effects

In keeping with the vintage aesthetic of celluloid capture, the visual effects were captured in-camera wherever possible. I’ve written in the past about the rise of LED screens as a replacement for green-screen and a source of interactive lighting. I guessed that First Man was using this technology from ECUs which showed the crescent of Earth reflected in Ryan Gosling’s eyes. Such things can be added in post, of course, but First Man‘s VFX have the unmistakeable ring of in-camera authenticity.

Imposing a “no green-screen” rule, Chazelle and his team used a huge LED screen to display the views out of the spacecraft windows. A 180° arc of 60′ diameter and 35′ in height, this screen was bright enough to provide all the interactive lighting that Sandgren required. His only addition was a 5K tungsten par or 18K HMI on a crane arm to represent the direct light of the sun.

The old-school approach extended to building and filming miniatures, of the Saturn V rocket and its launch tower for example. For a sequence of Armstrong in an elevator ascending the tower, the LED screen behind Gosling displayed footage of this miniature.

For external views of the capsules in space, the filmmakers tried to limit themselves to realistic shots which a camera mounted on the bodywork might have been able to capture. This put me in mind of Christopher Nolan’s Interstellar, which used the same technique to sell the verisimilitude of its space vehicles. In an age when any conceivable camera move can be executed, it can be very powerful to stick to simple angles which tap into decades of history – not just from cinema but from documentaries and motorsports coverage too.

 

Lunar Lighting

For scenes on Earth, Sandgren walked a line between naturalism and expression, influenced by legendary DPs like Gordon Willis, ASC. My favourite shot is a wide of Armstrong's street at night, as he and his ill-fated friend Ed White (Jason Clarke) part company after a drinking session. The mundane suburban setting is bathed in blue moonbeams, as if the moon's fingers are reaching out to draw the characters in.

Scenes on the lunar surface were captured at night on an outdoor set the size of three football pitches. To achieve absolute authenticity, Sandgren needed a single light source (representing the sun) fixed at 15° above the horizon. Covering an area that size was going to require one hell of a single source, so he went to Luminys, makers of the Softsun.

Softsuns

Softsuns are lamps of frankly ridiculous power. The 50KW model was used, amongst other things, to blast majestic streams of light through the windows of Buckingham Palace on The Crown, but Sandgren turned to the 100KW model. Even that proved insufficient, so he challenged Luminys to build a 200KW model, which they did.

The result is a completely stark and realistic depiction of a place where the sun is the only illumination, with no atmosphere to diffuse or redistribute it, no sky to glow and fill in the shadows. This ties in neatly with a prevailing theme in the film, that of associating black with death, when Armstrong symbolically casts his deceased daughter’s bracelet into an obsidian crater.

First Man may prove unsatisfying for some, with Armstrong’s taciturn and emotionally closed-off nature making his motivations unclear, but cinematically it is a tour de force. Taking a human perspective on extraordinary accomplishments, deftly blending utterly convincing VFX and immersive cinéma vérité photography, First Man recalls the similarly analogue and similarly gripping Dunkirk as well as the documentary-like approach of 1983’s The Right Stuff. The film is currently available on DVD, Blu-ray and VOD, and I highly recommend you check it out.


“The Little Mermaid”: Boats, Trains and Automobiles

One of the biggest challenges on The Little Mermaid was the amount of material set in moving vehicles at night. Over the course of the story, the heroes travel in two different trains, a pick-up truck and a riverboat, and I knew that lighting large stretches of railway, road or river wasn’t going to be practical on our budget. Ultimately much of it ended up being done against green screen, with the notable exception of the riverboat, the first mode of transport to go before the cameras. Here are the relevant extracts from my diary.

 

Day 14

Today’s a big day because we’re shooting on a riverboat which has been hired at great expense. We have a huge amount of material to cover and there’s no way we can come back to the boat later if we don’t get it all. Chris and I make a game plan in the afternoon and arrive at the dock in good time.

It feels a lot like a micro-budget movie, shooting on a location that perhaps should have been a set (once we set sail you can't see anything in the background because it's night) with a tiny lighting package running off a little genny: some Kinos, two LED panels, and a 1K baby. Out there in the dark river, it is eerie watching unfathomably huge container ships pass 50ft from us. We leave 'B' camera on the shore and try to co-ordinate with them by walkie as they shoot wide shots of the boat and we try to hide!

 

Day 16

Night driving scenes in a pick-up truck today. Poor Man's Process was considered for these, then doing it for real with a low loader (called a process trailer here in the States), but in the end green screen was chosen as the way to go.

The period vehicle is wheeled into our studio and parked in front of two 12×12 green screens, which VFX supervisor Rich dots with red tape crosses for tracking markers. Throughout the night he moves them around to make sure there are always a couple in shot. We light the green screen with two Image 80s (4ft 8-bank Kino Flos with integral ballasts) fitted with special chroma green tubes. Rich tells me to expose the screen at key, which in this case is T4.

Captain Dan Xeller, best boy electric, has lit car stuff before, so I give him free rein to establish the ambient level. He does it with 1Ks fired into 8×4 bounce boards, so that any reflections in the car's bodywork will be large and sky-like, not strips like Kino Flos or points like pars or fresnels.

For shape we add a 5K with a chimera at a three-quarter angle, and a side-on par can with a “branch-a-loris” in front of it. Key grip Jason Batey designs this rig, consisting of two branches on a pivot like a Catherine Wheel, which can be spun at any speed by one of the grips, to simulate movement of the car.

Finally I add a 2K poking over the top of the green screen with Steel Blue gel, as a gratuitous hair-light.

Most of the night’s work is handheld, often with two cameras, but we also get some dolly shots, moving towards or away from the car, again to simulate movement.

 

Day 17

More green screen work today. At the end of the night we recreate one of the scenes from the boat with a piece of railing against the green screen. I do exactly the same lighting as before – Steel Blue three-quarter backlight, and a tungsten key bounced off polyboard. I love the way the actors’ skin looks under this light. Tungsten bounced off polyboard may just be the best light source ever.

 

Day 18

Stage scenes on real sets today, one of which is meant to be on the riverboat. The grips come up with a gag where we shine moonlight through an off-camera window gobo, which they handbash back and forth to simulate the boat rocking. We end up dialling it down so it’s very subtle, but still adds a hint of movement.

We move to the caboose (guard’s van), one of the train carriage sets. A second branch-a-loris is constructed so that both windows on one side of the carriage can have the passing trees effect cutting up the hard fresnel “moonlight”. We light from the other side with Kinos, and add a 1K baby bounced off foamcore to represent light from a practical oil lamp. Later the dialogue transitions to a fight scene, and we replace the bounced baby with an LED panel so it’s a little easier to move around and keep out of shot. I get to do some energetic handheld camerawork following the action, which is always fun.

 


Day 27

Interiors on stage, followed by night exteriors out the back of the studio. One of these is a shot of the heroes running, supposedly towards the train. It’s shot from the back of the 1st AD’s pick-up truck as we drive next to them. We have no condor today so the 12K backlight is just on a roadrunner stand, flooding out across the marsh between the lamp and the talent. With smoke it looks great, but lens flare keeps creeping in because the lamp’s not high enough.

We also shoot some Poor Man’s Process around a small set of the rear of a train car. Two lamps with branch-a-lorises in front of them, wind, smoke and shaky cameras help sell the movement.


Later we have a POV shot of a train screeching to a stop in front of the villain. The camera is on a dolly and the G&E team mount a 2K on there as well, to represent the train’s headlight.

Next week I’ll turn my attention to The Little Mermaid‘s smaller scenes, and discuss how the principle of lighting from the back was applied to them. Meanwhile, if you’re interested in some techniques for shooting in genuinely-moving vehicles, check out my blog from week three of Above the Clouds where we shot on Longcross Studios’ test track, and my article “Int. Car – Moving”.


Lighting with LED Screens

Gravity’s LED light box

LED lighting has found its way onto most sets now, but there is another off-shoot of LED technology which I see cropping up more and more in American Cinematographer articles. Sometimes it’s lighting, sometimes it’s a special effect, and often it’s both. I’m talking about LED screens: huge LED panels that, rather than emitting solid, constant light, display a moving image like a giant monitor.

I touched on LED Screens in my article about shooting on moving trains, and moving backgrounds do seem to be one of the most common uses for these screens. House of Cards has been in the news this week for all the wrong reasons, but it remains a useful example here. Production designer Steve Arnold describes the use of LED screens for car scenes in the political drama:

We had a camera crew go to Washington, D.C. to drive around and shoot plates for what you see outside when you’re driving. And that is fed into the LED screens above the car. So as the scene is progressing, the LED screens are synched up to emit interactive light to match the light conditions you see in the scenery you’re driving past (that will be added in post). All the reflections on the car windows, the window frames and door jambs is being shot while we’re shooting the actors in the car. Then in post the green screens are replaced with the synced up driving plates, and it works really well. It gives you the sense of light passing over the actors’ faces, matching the lighting that is in the image of the plate.

The green-screen stage used for car scenes on House of Cards, complete with LED screens for interactive lighting.

This appears to be the go-to method for shooting car scenes now, and more exotic forms of transport are using the technique as well. Rogue One employed “a massive array of WinVision Air 9mm LED panels” to create “an interactive hyperspace lighting effect” (American Cinematographer, February 2017).

The hyperspace VFX is displayed on a huge LED screen on the set of Rogue One.

Production designer Doug Chiang comments on the use of LED screens in the Death Star command centre:

We wanted to see things on the viewscreen where traditionally it would have been a giant bluescreen; we wanted the interactive reflective quality of what you would actually see. Even though we ultimately had to replace some of those images with higher-fidelity images in postproduction, they were enough to give a sense that the quality of light on the actors and the reflections on the set looked and felt very real.

One of the first major uses of LED screens for lighting was in the seminal stranded-in-space thriller Gravity. Concerned about blending the actors convincingly with the CGI backgrounds, DP Emmanuel Lubezki, ASC, AMC came up with a solution that was, at the time, cutting-edge: “I had the idea to build a set out of LED panels and to light the actors’ faces inside it with the previs animation.” (AC, November 2013)

Gravity also featured a scene in which Sandra Bullock’s character puts out a fire, and here once again LED panels provided interactive light. This is a technique that has since been used on several other films to simulate off-camera fires, including Christopher Nolan’s Dunkirk, and the true story of the BP oil rig disaster, Deepwater Horizon.

An LED screen in use on Dunkirk

Traditionally, fire has been simulated with tungsten sources, often Maxibrutes, but on Deepwater Horizon these were relegated to background action, while foregrounds were keyed by a huge 42'x24' video wall made up of 252 LED panels. DP Enrique Chediak, ASC had this to say (in AC, October 2016):

Fire caused by burning oil is very red and has deep blacks. You cannot get that with the substance that the special effects crews use – all those propane fires are yellow. Oil fire has a very specific quality, and I wanted to reach that. It was important to feel the sense of hell.

By playing back footage of real oil fires on the video wall, Chediak was able to get the realistic colour of lighting he wanted, while retaining authentic dynamics.

The giant LED wall on Deepwater Horizon

This technique isn’t necessarily confined to big-budget productions. In theory you could create interactive lighting with an iPad. For example, a tight shot of an actor supposedly warming themselves by a fireplace; if you could get the iPad close enough, playing a video of flames, I imagine the result would be quite convincing. Has anyone out there tried something like this? Let me know if you have!
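For what it's worth, here is a minimal sketch of the DIY version in Python with Pygame – entirely hypothetical and untested on a real set. Rather than playing back flame footage, it fills the screen with a warm colour whose brightness follows a smoothed random walk, a crude but serviceable stand-in for firelight when run full-screen on a tablet or laptop just out of shot. Real flame footage, as suggested above, will always look more organic; this is just an option if you have no suitable clip to hand.

```python
import random
import pygame

pygame.init()
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
clock = pygame.time.Clock()

level = 0.7   # current brightness, 0-1
running = True
while running:
    for event in pygame.event.get():
        if event.type in (pygame.QUIT, pygame.KEYDOWN):
            running = False  # any key quits

    # A smoothed random walk keeps the flicker organic rather than strobing
    target = min(1.0, max(0.25, level + random.uniform(-0.15, 0.15)))
    level += (target - level) * 0.5

    # Warm, tungsten-ish orange scaled by the flicker level
    screen.fill((int(255 * level), int(147 * level), int(41 * level)))
    pygame.display.flip()
    clock.tick(24)   # update the flicker 24 times a second

pygame.quit()
```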

I’ll leave you with a music video I shot a few years back (more info here), featuring custom-built LED panels in the background.


Lighting I Like: “12 Monkeys”

The latest episode of Lighting I Like is out, analysing how the “Splinter Chamber” set is lit in time travel thriller 12 Monkeys. This adaptation of the Terry Gilliam movie can be seen on Netflix in the UK.

I found out lots about the lighting of this scene from this article on the American Society of Cinematographers website. It didn't mention the source inside the time machine, but my guess is that it's a Panibeam 70, as used in the Cine Reflect Lighting System.

New episodes of Lighting I Like are released at 8pm BST every Wednesday. Next week I'll look at two scenes from Preacher. Click here to see the playlist of all Lighting I Like episodes.


How to do Scenes on a Moving Train

Behind the scenes of “Last Passenger”

The publicity machine is ramping up for Kenneth Branagh’s Murder on the Orient Express remake, and it’s got me thinking about the challenges of a script set largely on a moving train. There are a number of ways of realising such scenes, and today I’m going to look at five movies that demonstrate different techniques. All of these methods are equally applicable to scenes in cars or any other moving vehicle.

1. For Real: "The Darjeeling Limited"

Wes Anderson’s 2007 film The Darjeeling Limited sees three brothers embarking on a spiritual railway journey across India. Many of the usual Anderson tropes are present and correct – linear tracking shots, comical headgear, Jason Schwartzman – but surprisingly the moving train wasn’t done with some kind of cutesy stop-motion. Production designer Mark Friedberg explains:

The big creative decision Wes made was that we were going to shoot this movie on a moving train. And all that does is complicate life. It makes it more expensive, it makes the logistics impossible. It made it incredibly difficult to figure out how many crew, what crew, what gear… but what it did do is it made it real.

Kenneth Branagh has stated that at least some of Murder on the Orient Express was shot on a real moving train too:

They painstakingly built a fully functioning period authentic locomotive and carriages from the Orient Express during the golden, glamorous age of travel. It was a train that moved… All of our actors were passengers on the train down the leafy lanes of Surrey, pretending to be the former Yugoslavia.

 

2. Poor Man’s Process: “The Double”

Director Richard Ayoade

Although best known as The IT Crowd‘s Moss and the new host of the Crystal Maze, Richard Ayoade is also an accomplished director. His last feature was a darkly beautiful adaptation of Dostoyevsky’s classic identity-crisis novella The Double. 

Unlike the other movies on this list, The Double only has short sequences on a train, and that’s a key point. So named because it’s a cheap alternative to rear projection (a.k.a. process photography), Poor Man’s Process is a big cheat. In order to hide the lack of motion, you keep the view outside your vehicle’s windows blank and featureless – typically a night sky, but a black subway tunnel or a grey daytime sky can also work. Then you create the illusion of motion with dynamic lighting, a shaky camera, and grips rocking the carriage on its suspension. Used judiciously, this technique can be very convincing, but you would never get away with it for a whole movie.

Poor Man’s works particularly well in The Double, the black void outside the subway car playing into the oppressive and nightmarish tone of the whole film. In an interview with Pushing Pixels, production designer David Crank explains how the subway carriage set was built out of an old bus. He goes on to describe how the appearance of movement was created:

We put the forks of a forklift under the front of the bus, and shook it… For the effect of moving lights outside the train, it was a combination of some spinning lights on stands, as well as lights on small rolling platforms which tracked back and forth down the outside of the bus.

Part 2 of the Darjeeling Limited featurette above reveals that Poor Man’s Process was also used occasionally on that film, when the train was stuck in a siding due to heavy rail traffic. I used Poor Man’s myself for night-time train sequences in two no-budget features that I made in the early noughties – see the BTS clip below – and I’ve also written a couple of blog posts in the past about my use of the same technique on a promotional video and in a fantasy web series.

 

3. Green screen: “Source Code”

Duncan “Zowie Bowie” Jones followed up his low-budget masterpiece Moon with Hollywood sci-fi thriller Source Code, a sort of mash-up of Quantum Leap and Groundhog Day with a chilling twist. It takes place predominantly on a Chicago-bound commuter train, in reality a set surrounded by green screen. In the featurette above, Jones mentions that shooting on a real moving train was considered, but ultimately rejected in favour of the flexibility of working on stage:

Because we revisit an event multiple times, it was absolutely integral to making it work, and for the audience not to get bored, that we were able to vary the visuals. And in order to do that we had to be able to build platforms outside of the train and be able to really vary the camera angles.

In the DVD commentary, Jones also notes that the background plates were shot in post from a real train “loaded up with cameras”.

Director Duncan Jones on the set of “Source Code”

Cinematographer Don Burgess, ASC discusses lighting the fake train in a Panavision article:

It’s difficult to make it feel like natural light is coming in and still get the sense of movement on a train… We worked with computer programs where we actually move the light itself, and brighten and dim the lights so it feels as if you are travelling… The lights are never 100% constant.

When I shot The Little Mermaid last year we did some train material against green screen. To make the lighting dynamic, the grips built “branch-a-loris” rigs: windmills of tree branches which they would spin in front of the lamps to create passing shadows.

 

4. Rear projection: “Last Passenger”

Perhaps the most low-budget film on this list, Last Passenger is a 2013 independent thriller set aboard a runaway train. Director Omid Nooshin and DP Angus Hudson wanted a vintage look, choosing Cooke Xtal anamorphic lenses and a visual effects technique that had long since fallen out of favour: rear projection.

Before the advent of optical – and later digital – compositing, rear projection was commonly used to provide moving backgrounds for scenes in vehicles. The principle is simple: the pre-recorded backgrounds are projected onto a screen like this…

Rear projection in use on "River of No Return" (1954)

Hudson goes into further detail on the technique as used for Last Passenger:

To capture [the backgrounds] within our limited means, we ended up shooting from a real train using six Canon 5D cameras, rigged in such a way that we got forward, sideways and rear-facing views out of the train at the same time. We captured a huge amount of footage, hours and hours of footage. That allowed us to essentially have 270 degrees of travelling shots, all of which were interlinked.

Because rear projection is an in-camera technique, Nooshin and Hudson were able to have dirt and water droplets on the windows without worrying about creating a compositing nightmare in postproduction. Hudson also notes that the cast loved being able to see the backgrounds and react to them in real time.

 

5. L.E.D. Panels: “Train to Busan”

Enabling the actors to see the background plates was also a concern for Yeon Sang-ho, director of the hit Korean zombie movie Train to Busan. He felt that green screen would make it “difficult to portray the reality”, so he turned to the latest technology: LED screens. This must have made life easier not just for the cast, but for the cinematographer as well.

You see, when you travel by train in the daytime, most of the light inside the carriage comes from outside. Some of it is toplight from the big, flat sky, and some of it is hard light from the sun – both of these can be faked, as we’ve seen – but a lot of the light is reflected, bouncing off trees, houses, fields and all the other things that are zipping by. This is very difficult to simulate with traditional means, but with big, bright LED screens you get this interactive lighting for free. Because of this, and the lack of postproduction work required, this technique is becoming very popular for car and train scenes throughout the film and TV industry.

This brings us back to Murder on the Orient Express, for which 2,000 LED screens were reportedly employed. In a Digital Spy article, Branagh notes that this simulated motion had an unintended side effect:

It was curious that on the first day we used our gimballed train sets and our LED screens with footage that we’d gone to great trouble to shoot for the various environments – the lowlands and then the Alps, etc… people really did feel quite sick.

I’ll leave you with one final point of interest: some of the above films designed custom camera tracks into their train carriage sets. On Last Passenger, for example, the camera hung from a dolly which straddled the overhead luggage racks, while The Darjeeling Limited had an I-beam track designed into the centre of the ceiling. Non-train movies like Speed have used the same technique to capture dolly shots in the confines of a moving vehicle.

The luggage rack dolly on "Last Passenger"