How to Make High-end TV During a Pandemic

Many productions are up and running again, and a recent ScreenSkills seminar revealed how two high-end series were amongst the first to tackle TV-making during a global pandemic.

Death in Paradise is a long-running crime drama about fish-out-of-water British detectives – the latest played by Ralf Little – heading murder investigations on the fictional Caribbean island of Saint Marie. Production of the show’s tenth season, originally scheduled for April, commenced instead in late July.

The Pursuit of Love is a mini-series based on the novel by Nancy Mitford, set between the two world wars. Lily James and Emily Beecham star as women in quest of husbands, in an adaptation written and directed by Emily Mortimer. Filming likewise began in late July, in South West England.

What both productions have in common, and a key reason why they were able to start up ahead of so many others, is that their insurance was already in place before lockdown hit. The policies include producer’s indemnity, covering costs outside of the production’s control.

Co-executive producer Alex Jones of Red Planet Pictures explained that Death in Paradise had a few other things going for it too. Most obvious of these was the location, the French archipelago of Guadeloupe, which formed a natural bubble. All cast and crew were tested for Covid-19 before flying out, then again seven days after arrival and at the start of each filming block. Having been around for ten years made adapting the production easier than starting one from scratch, Jones believes.

Ian Hogan, line producer of The Pursuit of Love, did not have the advantage of an established machine. He said that a full-time health and safety adviser with a background in location management spent weeks working out Coronavirus protocols for the period drama. Crew members each received a copy of these, and were required to agree that they would not go out in their spare time except for exercise and essential shopping. Every day, crew must remotely declare that they have no symptoms of Covid-19 before they can receive a green pass allowing them through location security, and they must then pass a temperature check before accessing the set.

Both producers insist that age and underlying health problems are not a barrier to work. Cast and crew who are particularly vulnerable to Covid-19 are given a personalised risk assessment with mitigation steps to follow.

Death in Paradise chose to film using the “one metre plus” social distancing rule common to both France and England. A former assistant director was hired as a Covid supervisor, a role which sometimes involved helping to re-block scenes to avoid physical proximity.

But for The Pursuit of Love, as the title suggests, intimacy was crucial. The producers opted for a close-contact system, dividing personnel into cohorts. A mobile testing lab with a capacity of 70 a day is always on location, and everyone is checked at least once a week. The Director’s Cohort – consisting of Mortimer, the cast, and key on-set crew like the DP, boom op and focus puller – are tested twice a week.

A monitor signal is distributed wirelessly around the set to production iPads and personal devices, to prevent a crowded video village. The DIT sends this camera feed via a local wifi network using Qtake.

Both productions require face-coverings. At least one Death in Paradise director switched from a mask to a visor so that the cast and crew could still read facial expressions, so important when giving notes.

Visors are also used for close-contact work like make-up and costume, the two departments perhaps most affected by the pandemic. Hogan hired extra make-up trucks so that the chairs could be sufficiently spaced, and both productions expanded their crews to obviate the need for dailies. Instead, extra MUAs and dressers might be engaged for eight weeks out of 12, but on an exclusive basis so that they don’t risk spreading the virus to or from other sets.

Wardrobe fitting for supporting artists is much more involved than usual, as the same costume cannot be tried on multiple people without cleaning in-between. Greater numbers of costumes must be hired, and measurements that are taken remotely are much more important.

All of this is expensive, of course. Jones estimates it has added 15 per cent to Death in Paradise’s budget, covered fortunately by the insurance. The pace of filming has slowed, but not as much as might be expected, with just two extra filming days per block, and slightly less coverage recorded than before.

Both Jones and Hogan praised the responsibility and enthusiasm with which their crews returned to work. They are positive about the future of TV production. While there have been fears that Coronavirus would shrink crews, Jones’s has actually grown, with a larger off-set support staff. “Our industry is booming,” he concluded, “and it will continue to boom when this is all over.”

This article first appeared on RedShark News.


“Jurassic Park” Retrospective

With the temporary closure of Cineworlds around the UK, the future of theatrical exhibition once more hangs in the balance. But just a couple of months ago cinemas were reopening and people were positive that the industry would recover. One of the classic blockbusters that was re-released to plug the gaps in the release schedule ahead of Christopher Nolan’s Tenet was a certain quite popular film about dinosaurs. I described my trip to see it recently, but let’s put that hideous experience behind us and concentrate on the film itself.

Thanks in no small part to the excellent “making of” book by Don Shay and Jody Duncan, Jurassic Park was a formative experience for the 13-year-old Neil Oseman, setting me irrevocably on the path to filmmaking as a career. So let me take you back in time and behind the scenes of an iconic piece of popcorn fodder.

 

Man creates dinosaurs

Even before author Michael Crichton delivered the manuscript of his new novel in May 1990, Steven Spielberg had expressed an interest in adapting it. A brief bidding war between studios saw Joe Dante (Gremlins), Tim Burton (Batman) and Richard Donner (Superman) in the frame to direct, but Spielberg and Universal Pictures were the victors.

Storyboards by David Lowery. Lots of the film’s storyboards are reproduced in “The Making of Jurassic Park” by Don Shay and Jody Duncan.

The screenplay went through several drafts, first by Crichton himself, then by Malia Scotch Marmo and finally by David Koepp, who would go on to script Mission: Impossible, Spider-Man and Panic Room. Pre-production began long before Koepp finished writing, with Spielberg generating storyboards based directly on scenes from the book so that his team could figure out how they were going to bring the dinosaurs to life.

Inspired by a life-size theme park animatronic of King Kong, Spielberg initially wanted all the dinosaurs to be full-scale physical creatures throughout. This was quickly recognised as impractical, and instead Stan Winston Studio, creators of the Terminator endoskeleton, the Predator make-up and the fifteen-foot-tall Alien queen, focused on building full-scale hydraulically-actuated dinosaurs that would serve primarily for close-ups and mids.

Stan Winston’s crew with their hydraulic behemoth

Meanwhile, to accomplish the wider shots, Spielberg hired veteran stop-motion animator Phil Tippett, whose prior work included ED-209 in RoboCop, the tauntaun and AT-AT walkers in The Empire Strikes Back, and perhaps most relevantly, the titular creature from Dragonslayer. After producing some beautiful animatics – to give the crew a clearer previsualisation of the action than storyboards could provide – Tippett shot test footage of the “go-motion” process he intended to employ for the real scenes. Whilst this footage greatly improved on traditional stop-motion by incorporating motion blur, it failed to convince Spielberg.

https://youtu.be/_7tUlXz9MrA

At this point, Dennis Muren of Industrial Light and Magic stepped in. Muren was the visual effects supervisor behind the most significant milestones in computer-generated imagery up to that point: the stained-glass knight in Young Sherlock Holmes (1985), the water tendril in The Abyss (1989) and the liquid metal T-1000 in Terminator 2: Judgment Day (1991). When Spielberg saw his test footage – initially just skeletons running in a black void – the fluidity of the movement immediately grabbed the director’s attention. Further tests, culminating in a fully-skinned tyrannosaur stalking a herd of gallimimuses, had Spielberg completely convinced. On seeing the tests himself, Tippett famously quipped: “I think I’m extinct.”

The first CGI test

Tippett continued to work on Jurassic Park, however, ultimately earning a credit as dinosaur supervisor. Manipulating a custom-built armature named the Dinosaur Input Device, Tippett and his team were able to have their hands-on techniques recorded by computer and used to drive the CG models.

Building on his experiences working with the E.T. puppet, Spielberg pushed for realistic animal behaviours, visible breathing, and bird-like movements reflecting the latest paleontological theories, all of which would lend credibility to the dinosaurs. Effects co-supervisor Mark Dippe stated: “We used to go outdoors and run around and pretend we were gallimimuses or T-Rexes hunting each other, and shoot [reference] film.”

 

Dinosaurs eat man

Stan Winston’s triceratops was the first dinosaur to go before the cameras, and the only one to be filmed on location.

Production began in August 1992 with three weeks on the Hawaiian island of Kauai. Filming progressed smoothly until the final day on location, which had to be scrubbed due to Hurricane Iniki (although shots of the storm made it into the finished film). After a brief stint in the Mojave Desert, the crew settled into the stages at Universal Studios and Warner Brothers to record the bulk of the picture.

The most challenging sequence to film would also prove to be the movie’s most memorable: the T-Rex attack on the tour vehicles containing Sam Neill’s Dr. Grant, Jeff Goldblum’s Ian Malcolm, lawyer Gennaro and the children, Lex and Tim. It was the ultimate test for Stan Winston’s full-scale dinosaurs.

The T-Rex mounted on its motion simulator base on Stage 16 at Warner Brothers

The main T-Rex puppet weighed over six tonnes and was mounted on a flight simulator-style platform that had to be anchored into the bedrock under the soundstage. Although its actions were occasionally pre-programmed, the animal was mostly puppeteered live using something similar to the Dinosaur Input Device.

But the torrential rain in which the scene takes place was anathema to the finely tuned mechanics and electronics of the tyrannosaur. “As [the T-Rex] would get rained on,” Winston explained, “his skin would soak up water, his weight would change, and in the middle of the day he would start having the shakes and we would have to dry him down.”

Although hints of this shaking can be detected by an eagle-eyed viewer, the thrilling impact of the overall sequence was clear to Spielberg, who recognised that the T-Rex was the star of his picture. He hastily rewrote the ending to bring the mighty creature back, relying entirely on CGI for the new climax in which it battles raptors in the visitor centre’s rotunda.

The CGI T-Rex in the rewritten finale

 

Woman inherits the earth

After wrapping 12 days ahead of schedule, Jurassic Park hit US cinemas on June 11th, 1993. It became the highest-grossing film of all time, a title which it would hold until Titanic’s release four years later. 1994’s Oscar ceremony saw the prehistoric blockbuster awarded not only Best Visual Effects but also Best Sound Effects Editing and Best Sound. Indeed, Gary Rydstrom’s contribution to the film – using everything from a dolphin/walrus combination for the raptors’ calls, to the sound of his own dog playing with a rope toy for the T-Rex – cannot be overstated.

Jurassic Park has spawned four sequels to date (with a fifth on the way), and its impact on visual effects was enormous. For many years afterwards, blockbusters were filled with CGI that was unable to equal, let alone surpass, the quality of Jurassic Park’s. Watching it today, the CGI is still impressive if a little plasticky in texture, but I believe that the full-size animatronics which form the lion’s share of the dinosaurs’ screen time are what truly give the creatures their memorable verisimilitude. The film may be 27 years old, but it’s still every bit as entertaining as it was in 1993.

This article first appeared on RedShark News.

Director of photography Dean Cundey, ASC with the brachiosaur head puppet

5 Ways to Fake Firelight

Real SFX run a fishtail on the set of “Heretiks”

Firelight adds colour and dynamism to any lighting set-up, not to mention being essential for period and fantasy films. But often it’s not practical to use real firelight as your source. Even if you could do it safely, continuity could be a problem.

A production that can afford an experienced SFX crew might be able to employ fishtails, V-shaped gas outlets that produce a highly controllable bar of flame, as we did on Heretiks. If such luxuries are beyond your budget, however, you might need to think about simulating firelight. As my gaffer friend Richard Roberts once said while operating an array of flickering tungsten globes (method no. 3), “There’s nothing like a real fire… and this is nothing like a real fire.”

 

1. Waving Hands

The simplest way to fake firelight is to wave your hands in front of a light source. This will work for any kind of source, hard or soft; just experiment with movements and distances and find out what works best for you. A layer of diffusion on the lamp, another in a frame, and the waving hands in between, perhaps?

Visit my Instagram feed for loads more diagrams like this.

One of my favourite lighting stories involves a big night exterior from The First Musketeer, which we shot at the Chateau de Fumel in the Lot Valley, France. We were just about to turn over when a bunch of automatic floodlights came on, illuminating the front of the chateau and destroying the period illusion of our scene. We all ran around for a while, looking for the off switch, but couldn’t find it. In the end I put orange gel on the floodlights and had someone crouch next to each one, wiggling their hands like a magician, and suddenly the chateau appeared to be lit by burning braziers.

 

2. Wobbling Reflector

This is my go-to technique – quick, easy and effective. It’s demonstrated in my Cinematic Lighting course on Udemy and also in this episode of Lensing Ren:

All you need is a collapsible reflector with a gold side, and an open-face tungsten fixture. Simply point the latter at the former and wobble the reflector during the take to create the flickering effect.

 

3. Tungsten Array

If you want to get more sophisticated, you can create a rig of tungsten units hooked up to a dimmer board. Electronic boxes exist to create a flame-like dimming pattern, but you can also just do it by pushing the sliders up and down randomly. I’ve done this a lot with 100W tungsten globes in simple pendant fittings, clipped to parts of the set or to wooden battens. You can add more dynamics by gelling the individual lamps with different colours – yellows, oranges and reds.
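For those who would rather previsualise the pattern than buy a flicker box, the random dimming described above is easy to sketch in code. This is only an illustrative toy – the function and its parameters are my own invention – approximating a hand riding the faders randomly, with a little smoothing so the lamps roll rather than strobe:

```python
import random

def flicker_levels(n, base=0.7, depth=0.25, smoothing=0.6, seed=1):
    """Generate n dimmer levels (0.0-1.0) that wander randomly around
    a base brightness, eased so changes roll rather than snap -- a rough
    stand-in for randomly pushing the sliders on a dimmer board."""
    rng = random.Random(seed)
    level, levels = base, []
    for _ in range(n):
        target = base + rng.uniform(-depth, depth)        # pick a new random brightness
        level = smoothing * level + (1 - smoothing) * target  # ease towards it
        levels.append(max(0.0, min(1.0, level)))
    return levels

# e.g. 25 updates for one second of flicker at a 25Hz refresh rate
print(flicker_levels(25))
```

Gelling the individual lamps with different colours, as described above, is the analogue equivalent of giving each channel its own seed.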

John Higgins’ 2MW firelight rig from “1917”

Larger productions tend to use Brutes, a.k.a. Dinos, a.k.a. 9-lights, which are banks of 1K pars. The zenith of this technique is the two megawatt rig built by gaffer John Higgins for Roger Deakins, CBE, BSC, ASC on 1917.

 

4. Programmed L.E.D.

Technological advances in recent years have provided a couple of new methods of simulating firelight. One of these is the emergence of LED fixtures with built-in effects programmes like police lights, lightning and flames. These units come in all shapes, sizes and price-ranges.

Philip Bloom’s budget fire-effect rig on location for “Filmmaking for Photographers”

On War of the Worlds: The Attack last year, gaffer Callum Begley introduced me to Astera tubes, and we used their flame effect for a campfire scene in the woods when we were having continuity problems with the real fire. For the more financially challenged, domestic fire-effect LED bulbs are cheap and screw into standard sockets. Philip Bloom had a few of these on goose-neck fittings which we used extensively in the fireplaces of Devizes Castle when shooting a filmmaking course for MZed.

 

5. L.E.D. Screen

A logical extension of an LED panel or bulb that crudely represents the pattern of flames is an LED screen that actually plays video footage of a fire. The oil rig disaster docu-drama Deepwater Horizon and Christopher Nolan’s Dunkirk are just two films that have used giant screens to create the interactive light of off-camera fires. There are many other uses for LED screens in lighting, which I’ve covered in detail before, with the ultimate evolution being Mandalorian-style virtual volumes.

You don’t necessarily need a huge budget to try this technique. What about playing one of those festive YouTube videos of a crackling log fire on your home TV? For certain shots, especially given the high native ISOs of some cameras today, this might make a pretty convincing firelight effect. For a while now I’ve been meaning to try fire footage on an iPad as a surrogate candle. There is much here to explore.

So remember, there may be no smoke without fire, but there can be firelight without fire.


A Post-lockdown Trip to the Cinema

This article first appeared on RedShark News last month.

What’s wrong with this picture? Apparently nothing, if you work for the Light.

As I write this, I’ve just got back from my first trip to the cinema in six months. Although they have been allowed to reopen in England since July 4th, the higher operating costs in the pandemic kept many cinemas dark well into August. On Friday the 21st, my local branch of the Light here in Cambridge finally opened its doors, and I went along to experience post-Covid cinema.

Studios have been shifting their release dates throughout the lockdown, with some films giving up on theatrical exhibition altogether, so the Light, like its competitors, has filled its screens with classics for now. I selected Jurassic Park, which I haven’t seen on the big screen since its original release in 1993.

When I arrived, the lobby was dark and almost empty. Like most public spaces, it had sprouted new signage and a one-way system since March, and it took me a couple of attempts to find the right lane. Once inside the main corridor though, little had changed except the odd hand sanitiser dispenser on the wall.

I found my screen and took a seat. As with everything from trains to swimming pools, pre-booking is now strongly recommended, due to the diminished capacity caused by social distancing. When you pick your seat, the website makes you leave two empties between your party and the next. You can even pre-purchase your popcorn and bucket of cola.

I needn’t have booked, however. In a screen of about 100 seats, exactly ten were occupied. It will take the general public a while to cotton on to the fact that cinema-going is an option again, even before they decide whether they feel comfortable doing so.

As I sat masked and expectant, my hands sticky from sanitiser that refused to evaporate, I was treated to a rare sight: a cinema employee inside the auditorium. He announced that they didn’t have any ads or trailers yet, so they would delay starting the film to give everyone a chance to arrive.

A few minutes later, the man reappeared and asked us all to decamp to the corridor. Apparently they had installed a new sound system, and they needed to test it, which could be very loud. Why they couldn’t have checked the system for eardrum bursting at some point in the last six months is beyond me.

The ten of us duly waited in the corridor. A snatch of the Imperial March from an adjacent screen betokened another classic being wheeled out. A woman with a spray bottle and a cloth, masked like all of her colleagues, worked her way down the corridor, cleaning the door handles. A group next to me (but, I hasten to add, appropriately distant) cracked jokes about the sex appeal of Jeff Goldblum’s Ian Malcolm. Another group, evidently missing the trailers, watched one on a phone. (If that doesn’t sum up the existential crisis facing cinema, I don’t know what does.)

At last we were readmitted. The lights dimmed, the sounds of a jungle faded up on the brand new sound system, and the Universal logo appeared. But the trademark globe looked like a deflated football. The film was being projected in the wrong aspect ratio. And not just slightly. It was almost unwatchably stretched, as if the flat 1.85:1 images were being shown through a 2:1 anamorphic lens.

By the time the first scene was dissolving away to Bob Peck’s cries of “Shoot her!” the problem hadn’t been corrected, so I stepped out to find a member of staff. The senior person on duty claimed that the problem lay with the file supplied by the distributor, not with the projection. “There’s nothing I can do,” he insisted, while I goggled over my mask in disbelief.

At this point, had I not had this article to write, I would have gone home and watched the film on Netflix, or even on DVD. (There’s that existential crisis again.) But I persevered, trying not to imagine Dean Cundey weeping tears of frustration into his beard.

Fortunately, Jurassic Park is such a great film that it could be appreciated even in the face of such technical incompetence. A larger audience to enjoy the scares and humour with would have been nice, though since screaming and laughing project droplets further, perhaps that’s less than ideal these days.

Overall, I must say that I found the experience of going to the cinema less altered than many other aspects of life. I’ve got used to wearing a mask, so much so that I was halfway home before I remembered to take it off, and I normally avoid peak times so the emptiness didn’t feel too unusual.

But with the rise in streaming subscriptions during lockdown, and the understandable caution that many feel about going out, cinemas will need to work much harder to get bums back on flip-up seats. The kind of technical troubles that the Light suffered tonight will only strengthen the case for staying at home, mask-free and pyjama-clad, where you can control both the virus and the aspect ratio.

A week after writing this, I went to a Showcase to see Tenet. The member of staff who took our tickets unequivocally told us that the printed screen number was wrong, and that we should go to another one. We did so. The ads and trailers finally started, fifteen minutes late. We were just wondering why they were trailing such kid-friendly movies when another member of staff came in and told us that Tenet was showing in the original screen after all, and by the way, you’ve missed the first couple of minutes. 

Hopefully it is now clear why I wrote “10 Reasons Why Cinemas Don’t Deserve to Survive the Pandemic”.


Will “The Mandalorian” Revolutionise Filmmaking?

Last week, Greig Fraser, ASC, ACS and Baz Idoine were awarded the Emmy for Outstanding Cinematography for a Single-camera Series (Half-hour) for The Mandalorian. I haven’t yet seen this Star Wars TV series, but I’ve heard and read plenty about it, and to call it a revolution in filmmaking is not hyperbole.

Half of the series was not shot on location or on sets, but on something called a volume: a stage with walls and ceiling made of LED screens, 20ft tall, 75ft across and encompassing 270° of the space. I’ve written before about using large video screens to provide backgrounds in limited ways, outside of train windows for example, and using them as sources of interactive light, but the volume takes things to a whole new level.

In the past, the drawback of the technology has been one of perspective; it’s a flat, two-dimensional screen. Any camera movement revealed this immediately, because of the lack of parallax. So these screens tended to be kept to the deep background, with limited camera movement, or with plenty of real people and objects in the foreground to draw the eye. The footage shown on the screens was pre-filmed or pre-rendered, just video files being played back.

The Mandalorian‘s system, run by multiple computers simultaneously, is much cleverer. Rather than a video clip, everything is rendered in real time from a pre-built 3D environment known as a load, running on software developed for the gaming industry called Unreal Engine. Around the stage are a number of witness cameras which use infra-red to monitor the movements of the cinema camera in the same way that an actor is performance-captured for a film like Avatar. The data is fed into Unreal Engine, which generates the correct shifts in perspective and sends them to the video walls in real time. The result is that the flat screen appears, from the cinema camera’s point of view, to have all the depth and distance required for the scene.
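The perspective trick at the heart of the system can be illustrated with a toy calculation. This is not ILM’s actual code – just a minimal sketch, with made-up numbers, that treats the wall as a flat plane and finds where a virtual scene point must be drawn on it by intersecting the camera-to-point ray with the wall:

```python
import numpy as np

def project_to_wall(point, cam_pos, wall_z):
    """Find where a virtual 3D point should be drawn on a flat LED wall
    at z = wall_z so that it looks correct from the tracked camera
    position: intersect the camera-to-point ray with the wall plane."""
    direction = point - cam_pos
    t = (wall_z - cam_pos[2]) / direction[2]  # ray parameter at the wall
    return cam_pos + t * direction

# A virtual mountain 100m beyond a wall standing 5m from the origin
mountain = np.array([10.0, 3.0, 105.0])

# As the tracked camera moves sideways, the mountain's image on the
# wall shifts too -- the parallax a pre-rendered clip cannot provide
for cam_x in (0.0, 1.0):
    cam = np.array([cam_x, 1.5, 0.0])
    print(project_to_wall(mountain, cam, 5.0)[:2])
```

Because the camera’s position is known every frame, the engine can redraw the wall continuously, which is why a flat screen appears to have all the depth and distance the scene requires.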

The loads are created by CG artists working to the production designer’s instructions, and textured with photographs taken at real locations around the world. In at least one case, a miniature set was built by the art department and then digitised. The scene is lit with virtual lights by the DP – all this still during preproduction.

The volume’s 270° of screens, plus two supplementary, moveable screens in the 90° gap behind camera, are big enough and bright enough that they provide most or all of the illumination required to film under. The advantages are obvious. “We can create a perfect environment where you have two minutes to sunset frozen in time for an entire ten-hour day,” Idoine explains. “If we need to do a turnaround, we merely rotate the sky and background, and we’re ready to shoot!”

Traditional lighting fixtures are used minimally on the volume, usually for hard light, which the omni-directional pixels of an LED screen can never reproduce. If the DPs require soft sources beyond what is built into the load, the technicians can turn any off-camera part of the video screens into an area of whatever colour and brightness are required – a virtual white poly-board or black solid, for example.

A key reason for choosing the volume technology was the reflective nature of the eponymous Mandalorian’s armour. Had the series been shot on a green-screen, reflections in his shiny helmet would have been a nightmare for the compositing team. The volume is also much more actor- and filmmaker-friendly; it’s better for everyone when you can capture things in-camera, rather than trying to imagine what they will look like after postproduction. “It gives the control of cinematography back to the cinematographer,” Idoine remarks. VR headsets mean that he and the director can even do a virtual recce.

The Mandalorian shoots on the Arri Alexa LF (large format), giving a shallow depth of field which helps to avoid moiré problems with the video wall. To ensure accurate chromatic reproduction, the wall was calibrated to the Alexa LF’s colour filter array.

Although the whole system was expensive to set up, once up and running it’s easy to imagine how quickly and cheaply the filmmakers can shoot on any given “set”. The volume has limitations, of course. If the cast need to get within a few feet of a wall, for example, or walk through a door, then that set-piece has to be real. If a scene calls for a lot of direct sunlight, then the crew move outside to the studio backlot. But undoubtedly this technology will improve rapidly, so that it won’t be long before we see films and TV episodes shot entirely on volumes. Perhaps one day it could overtake traditional production methods?

For much more detailed information on shooting The Mandalorian, see this American Cinematographer article.


5 Things Bob Ross Can Teach Us About Cinematography

I’m certainly glad you could join me today. It’s a fantastic day here and I hope it is wherever you’re at. Are you ready to read a fantastic little blog post? Good, then let’s get started.

For twelve years, across 400 episodes, Bob Ross entertained all generations of Americans with his public access TV series, The Joy of Painting. Although he floated up to join the happy little clouds in 1995, in recent years YouTube and Twitch have brought his shows to a new audience, of which I am a humble member. Bob’s hypnotic, soft-spoken voice, his unfailingly positive attitude, and the magical effects of his wet-on-wet oil-painting technique make his series calming, comforting and captivating in equal measure.

Having watched every episode at least twice now, I’ve noticed several nuggets of Bob Ross wisdom that apply just as well to cinematography as they do to painting.

 

1. “The more planes you have in your painting, the more depth it has… and that’s what brings the happy buck.”

Bob always starts with the background of his scene and paints forward: first the sky with its happy little clouds; then often some almighty mountains; then the little footy hills; some trees way in the distance, barely more than scratches on the canvas; then perhaps a lake, its reflections springing forth impossibly from Bob’s brush; the near bank; and some detailed trees and bushes in the foreground, with a little path winding through them.

“Exile Incessant” (dir. James Reynolds)

Just as with landscape painting, depth is tremendously important in cinematography. Creating a three-dimensional world with a monoscopic camera is a big part of a DP’s job, which starts with composition – shooting towards a window, for example, rather than a wall – and continues with lighting. Depth increases production value, which makes for a happy producer and a happy buck for you when you get hired again.

 

2. “As things get further away from you in a landscape, they get lighter in value.”

Regular Joy of Painting viewers soon notice that the more distant layers of Bob’s paintings use a lot more Titanium White than the closer ones. Bob frequently explains that each layer should be darker and more detailed than the one behind it, “and that’s what creates the illusion of depth”.

“The Gong Fu Connection” (dir. Ted Duran)

Distant objects seem lighter and less contrasty because of a phenomenon called aerial perspective – basically the atmospheric scattering of light. As a DP, you can simulate this by lighting deeper areas of your frame brightly, and keeping closer areas dark. This might be achieved by setting up a flag to provide negative fill on an object in the foreground, or by placing a battery-powered LED fixture at the end of a dark street. The technique works for night scenes and small interiors just as well as for daytime landscapes, even though aerial perspective would never occur there in real life. The viewer’s brain will subconsciously recognise the depth cue and appreciate the three-dimensionality of the set much more.

 

3. “Don’t kill the little misty area; that’s your separator.”

After completing each layer, particularly hills and mountains, Bob takes a clean, dry brush and taps gently along the bottom of it. This has a blurring and fading effect, giving the impression that the base of the layer is dissolving into mist. When he paints the next layer, he takes care to leave a little of this misty area showing behind it.

“Heretiks” (dir. Paul Hyett)

We DPs can add atmos (smoke) to a scene to create separation. Because there will be more atmos between the lens and a distant object than between the lens and a close object, it really aids the eye in identifying the different planes. That makes the image both clearer and more aesthetically pleasing. Layers can also be separated with backlight, or a differentiation of tones or colours.

 

4. “You need the dark in order to show the light.”

Hinting at the tragedy in his own life, Bob often underlines the importance of playing dark tones against light ones. “It’s like in life. Gotta have a little sadness once in a while so you know when the good times come,” he wisely remarks, as he taps away at the canvas with his fan-brush, painting in the dark rear leaves of a tree. Then he moves on to the lighter foreground leaves, “but don’t kill your dark areas,” he cautions.

“Closer Each Day” promo (dir. Oliver Park)

If there’s one thing that makes a cinematic image, it’s contrast. It can be very easy to over-light a scene, and it’s often a good idea to try turning a fixture or two off to see if the mood is improved. However bright or dark your scene is, where you don’t put light is just as important as where you do. Flagging a little natural light, blacking out a window, or removing the bubble from a practical can often add a nice bit of shape to the image.

 

5. “Maybe… maybe… maybe… Let’s DROP in an almighty tree.”

As the end of the episode approaches, and the painting seems complete, Bob has a habit of suddenly adding a big ol’ tree down one or both sides of the canvas. Since this covers up background layers that have been carefully constructed earlier in the show, Bob often gets letters complaining that he has spoilt a lovely painting. “Ruined!” is the knowing, light-hearted comment of the modern internet viewer.

“Synced” (dir. Devon Avery)

The function of these trees is to provide a foreground framing element which anchors the side of the image. I discussed this technique in my article on composing a wide shot. A solid, close object along the side or base of the frame makes the image much stronger. It gives a reason for the edge of the frame to be there rather than somewhere else. As DPs, we may not be able to just paint a tree in, but there’s often a fence, a pillar, a window frame, even a supporting artist that we can introduce to the foreground with a little tweaking of the camera position.

The ol’ clock on the wall tells me it’s time to go, so until next time: happy filming, and God bless, my friend.

If you’re keen to learn more about cinematography, don’t forget I have an in-depth course available on Udemy.

5 Things Bob Ross Can Teach Us About Cinematography

The Cinematography of “Chernobyl”

Like many of us, I’ve watched a lot of streaming shows this year. One of the best was Chernobyl, the HBO/Sky Atlantic mini-series about the nuclear power plant disaster of 1986, which I cheekily binged during a free trial of Now TV.

In July, Chernobyl deservedly scooped multiple honours at the Virgin Media British Academy Television (Craft) Awards. As well as the Bafta for best mini-series, lead actor Jared Harris, director Johan Renck, director of photography Jakob Ihre, production designers Luke Hull and Claire Levinson-Gendler, costume designer Odile Dicks-Mireaux, editors Simon Smith and Jinx Godfrey, composer Hildur Guðnadóttir, and the sound team all took home awards in their respective fiction categories.

I use the phrase “took home” figuratively, since no-one had left home in the first place. The craft awards ceremony was a surreal, socially-distanced affair, full of self-filmed, green-screened celebrities. Comedian Rachel Parris impersonated writer/actor Jessica Knappett, and the two mock-argued to present the award for Photography & Lighting: Fiction. Chernobyl’s DP Jakob Ihre, FSF gave his acceptance speech in black tie, despite being filmed on a phone in his living room. In it he thanked his second unit DP Jani-Petteri Passi as well as creator/writer Craig Mazin, one of the few principal players not to receive an award.

Mazin crafted a tense and utterly engrossing story across five hour-long instalments, a story all the more horrifying for its reality. Beginning with the suicide of Harris’ Valery Legasov on the second anniversary of the disaster, the series shifts back to 1986 and straight into the explosion of the No. 4 reactor at the Chernobyl Nuclear Power Plant in the Soviet Ukraine. Legasov, along with Boris Shcherbina (Stellan Skarsgård) and the fictional, composite character Ulana Khomyuk (Emily Watson), struggles to contain the meltdown while simultaneously investigating its cause. Legions of men are sacrificed to the radiation, wading through coolant water in dark, labyrinthine tunnels to shut off valves, running across what remains of the plant’s rooftop to collect chunks of lethal graphite, and mining in sweltering temperatures beneath the core to install heat exchangers that will prevent another catastrophic explosion.

For Swedish-born NFTS (National Film and Television School) graduate Jakob Ihre, Chernobyl was a first foray into TV. His initial concept for the show’s cinematography was to reflect the machinery of the Soviet Union. He envisaged a heavy camera package representing the apparatus of the state, comprising an Alexa Studio, with its mechanical shutter, plus anamorphic lenses. “After another two or three months of preproduction,” he told the Arri Channel, “we realised maybe that’s the wrong way to go, and we should actually focus on the characters, on the human beings, the real people who this series is about.”

Sensitivity and respect for the people and their terrible circumstances ultimately became the touchstone for both Ihre and his director. The pair conducted a blind test of ten different lens sets, and both independently selected Cooke Panchros. “We did a U-turn and of course we went for spherical lenses, which in some way are less obtrusive and more subtle,” said Ihre. For the same reason, he chose the Alexa Mini over its big brother. A smaller camera package like this is often selected when filmmakers wish to distract and overwhelm their cast as little as possible, and is believed by many to result in more authentic performances.

When it came to lighting, “We were inspired by the old Soviet murals, where you see the atom, which is often symbolised as a sun with its rays, and you see the workers standing next to that and working hand in hand with the so-called ‘friendly’ atom.” Accordingly, Ihre used light to represent gamma radiation, with characters growing brighter and over-exposed as they approach more dangerous areas.

Ihre thought of the disaster as damaging the fabric of the world, distorting reality. He strove to visualise this through dynamic lighting, with units on dimmers or fitted with remote-controlled shutters. He also allowed the level of atmos (smoke) in a scene to vary – normally a big no-no for continuity. The result is a series in which nothing feels safe or stable.

The DP shot through windows and glass partitions wherever possible, to further suggest a distorted world. Working with Hull and Levinson-Gendler, he tested numerous transparent plastics to find the right one for the curtains in the hospital scenes. In our current reality, filled with perspex partitions (and awards ceremonies shot on phones), such imagery of isolation is eerily prescient.

The subject of an invisible, society-changing killer may have become accidentally topical, but the series’ main theme was more deliberately so. “What is the cost of lies?” asks Legasov. “It’s not that we’ll mistake them for the truth. The real danger is that if we hear enough lies, then we no longer recognise the truth at all.” In our post-truth world, the disinformation, denial and delayed responses surrounding the Chernobyl disaster are uncomfortably familiar.

This article first appeared on RedShark News.


How is Dynamic Range Measured?

The high dynamic range of the ARRI Alexa Mini allowed me to retain all the sky detail in this shot from “Above the Clouds”.

Recently I’ve been pondering which camera to shoot an upcoming project on, so I consulted the ASC’s comparison chart. Amongst the many specs compared is dynamic range, and I noticed that the ARRI Alexa’s was given as 14+ stops, while the Blackmagic URSA’s was given as 15. Having used both cameras a fair bit, I can tell you that there’s no way in Hell that the URSA has a higher dynamic range than the Alexa. So what’s going on here?

 

What is dynamic range?

To put it simply, dynamic range is the level of contrast that an imaging system can handle. To quote Alan Roberts, who we’ll come back to later:

This is normally calculated as the ratio of the exposure which just causes white clipping to the exposure level below which no details can be seen.

A photosite on a digital camera’s sensor outputs a voltage proportional to the amount of light hitting it, but at some point the voltage reaches a maximum, and no matter how much more light you add, it won’t change. At the other end of the scale, a photosite may receive so little light that it outputs no voltage, or at least nothing that’s discernible from the inherent electronic noise in the system. These upper and lower limits of brightness may be narrowed by image processing within the camera, with RAW recording usually retaining the full dynamic range, while linear Rec. 709 severely curtails it.

In photography and cinematography, we measure dynamic range in stops – doublings and halvings of light which I explain fully in this article. One stop is a ratio of 2:1, five stops are 32:1, and thirteen stops are almost 10,000:1.
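Since each stop is a doubling, the ratio grows as a power of two. A quick sketch of the arithmetic (the function name is just mine, for illustration):

```python
def stops_to_ratio(stops: float) -> float:
    """Convert a dynamic range in stops to a contrast ratio of X:1."""
    return 2.0 ** stops

# One stop is 2:1, five stops are 32:1, thirteen stops are 8192:1 --
# which is where "almost 10,000:1" comes from.
for stops in (1, 5, 13):
    print(f"{stops} stops = {stops_to_ratio(stops):,.0f}:1")
```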

It’s worth pausing here to point out the difference between dynamic range and latitude, two terms that are sometimes treated as synonymous but are not. Latitude is a measure of how much the camera can be over- or under-exposed without losing any detail, and it depends on both the dynamic range of the camera and the dynamic range of the scene. (A low-contrast scene allows more latitude for incorrect exposure than a high-contrast one.)
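That dependency can be put as back-of-the-envelope arithmetic: latitude is roughly the camera’s dynamic range minus the scene’s. A quick sketch, with invented example figures:

```python
def latitude_stops(camera_dr: float, scene_dr: float) -> float:
    """Rough latitude: how many stops of mis-exposure still fit the whole
    scene inside the camera's dynamic range. Figures are illustrative only."""
    return max(camera_dr - scene_dr, 0.0)

# A 14-stop camera shooting a 9-stop scene leaves about 5 stops of wiggle room;
# give it a scene as contrasty as the camera's own range and there's none at all.
print(latitude_stops(14, 9))   # -> 5.0
print(latitude_stops(14, 14))  # -> 0.0
```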

 

Problems of Measurement

Before digital cinema cameras were developed, video had a dynamic range of about seven stops. You could measure this relatively easily by shooting a greyscale chart and observing the waveform of the recorded image to see where the highlights levelled off and the shadows disappeared into the noise floor. With today’s dynamic ranges into double digits, simple charts are no longer practical, because you can’t manufacture white enough paper or black enough ink.
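The old greyscale method can still be sketched in code: model a photosite as a linear response that clips, feed it patches one stop apart, and count the patches that sit above the noise floor without clipping. Every number below is invented purely for illustration:

```python
# Invented sensor figures, for illustration only
CLIP_LEVEL = 1.0      # exposure at or above this just causes white clipping
NOISE_FLOOR = 1.0e-4  # signals below this vanish into the electronic noise

def photosite(light: float) -> float:
    """Toy photosite: voltage proportional to light, clipping at CLIP_LEVEL."""
    return min(light, CLIP_LEVEL)

def measure_stops(chart_patches: int = 24) -> int:
    """Shoot a virtual greyscale chart whose patches sit one stop apart, and
    count how many are distinguishable between clipping and the noise floor."""
    usable = 0
    for i in range(chart_patches):
        signal = photosite(CLIP_LEVEL * 2.0 ** -i)  # each patch a stop darker
        if signal > NOISE_FLOOR:
            usable += 1
    return usable

print(measure_stops())  # -> 14 with these made-up numbers
```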

For his excellent video on dynamic range, Filmmaker IQ’s John Hess built a device fitted with a row of 1W LEDs, using layers of neutral density gel to make each one a stop darker than its neighbour. For the purposes of his demonstration, this works fine, but as Phil Rhodes points out on RedShark News, you start running into the issue of the dynamic range of the lens.

It may seem strange to think that a lens has dynamic range, and in the past when I’ve heard other DPs talk about certain glass being more or less contrasty, I admit that I haven’t thought much about what that means. What it means is flare, and not the good anamorphic streak kind, but the general veiling whereby a strong light shining into the lens will raise the overall brightness of the image as it bounces around the different elements. This lifts the shadows, producing a certain amount of milkiness. Even with high-contrast lenses, which are less prone to veiling, the brightest light on your test device will cast some glare over the darkest one when you’re measuring the kind of dynamic range today’s cameras enjoy.
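A toy calculation shows how much veiling glare can eat into a measurement. Suppose (figures invented) the lens scatters just 0.05% of the brightest patch’s light evenly across the frame; the lifted shadows knock a couple of stops off the apparent range:

```python
import math

CHART_STOPS = 14
VEIL = 0.0005  # stray light added to every patch: 0.05% of the brightest patch

# True patch luminances, one stop apart, versus what arrives at the sensor
patches = [2.0 ** -i for i in range(CHART_STOPS)]
seen = [p + VEIL for p in patches]

true_range = math.log2(patches[0] / patches[-1])
seen_range = math.log2(seen[0] / seen[-1])
print(f"chart: {true_range:.1f} stops, through the lens: {seen_range:.1f} stops")
```

With these numbers, the 13-stop ratio on the chart reads as only about 10.7 stops through the lens, which is why serious test rigs go to such lengths to control stray light.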

 

Manufacturer Measurements

Going back to my original query about the Alexa versus the URSA, let’s see exactly what the manufacturers say. ARRI specifically states that its sensor’s dynamic range is over 14 stops “as measured with the ARRI Dynamic Range Test Chart”. So what is this chart and how does it work? The official sales blurb runs thusly:

The ARRI DRTC-1 is a special test chart and analysis software for measurement of dynamic range and sensitivity of digital cameras. Through a unique stray light reduction concept this system is able to accurately measure up to 15.5 stops of dynamic range.

The “stray light reduction” is presumably to reduce the veiling mentioned earlier and provide more accurate results. This could be as simple as covering or turning off the brighter lights when measuring the dimmer ones.

I found a bit more information about the test chart in a 2011 camera shoot-out video, from that momentous time when digital was supplanting film as the cinematic acquisition format of choice. Rather than John Hess’s ND gel technique, the DRTC-1 opts for something else to regulate its light output, as ARRI’s Michael Bravin explains in the video:

There’s a piece of motion picture film behind it that’s checked with a densitometer, and what you do is you set the exposure for your camera, and where you lose detail in the vertical and horizontal lines is your clipping point, and where you lose detail because of noise in the shadow areas is your lowest exposure… and in between you end up finding the number of stops of dynamic range.

Blackmagic Design do not state how they measure the dynamic range of their cameras, but it may be a DSC Labs Xyla. This illuminated chart boasts a shutter system which “allows users to isolate and evaluate individual steps”, plus a “stepped xylophone shape” to minimise flare problems.

Art Adams, a cinema lens specialist at ARRI, and someone who’s frequently quoted in Blain Brown’s Cinematography: Theory & Practice, told Y.M. Cinema Magazine:

I used to do a lot of consulting with DSC Labs, who make camera test charts, so I own a 20-stop dynamic range chart (DSC Labs Xyla). This is what most manufacturers use to test dynamic range (although not ARRI, because our engineers don’t feel it’s precise enough) and I see what companies claim as usable stops. You can see that they are just barely above the noise floor.

 

Conclusions

Obviously these ARRI folks I keep quoting may be biased. I wanted to find an independent test that measures both Blackmagics and Alexas with the same conditions and methodology, but I couldn’t find one. There is plenty of anecdotal evidence that Alexas have a bigger dynamic range; indeed, it’s widely accepted. Quantifying the difference is harder, though. The most solid thing I could find is this, from a 2017 article about the Blackmagic URSA Mini 4.6K (first generation):

The camera was measured at just over 14 stops of dynamic range in RAW 4:1 [and 13 stops in ProRes]. This is a good result, especially considering the price of the camera. To put this into perspective Alan measured the Canon C300 mkII at 15 stops of dynamic range. Both the URSA Mini 4.6 and C300 mkII are bettered by the ARRI Alexa and Amira, but then that comes as no surprise given their reputation and price.

The Alan mentioned is Alan Roberts, something of a legend when it comes to testing cameras, and one of the key players behind the TLCI (Television Lighting Consistency Index), a mooted replacement for CRI (Colour Rendering Index). That’s interesting because this whole dynamic range business is starting to remind me of my investigation into CRI, and it’s leading me to a similar conclusion: the numbers which the manufacturers give you are all but useless in real-world cinematography.

Whereas CRI at least has a standardised test, there’s no such thing for dynamic range. Therefore, until there is more transparency from manufacturers about how they measure it, I’d recommend ignoring their published values. As always when choosing a camera, shoot your own tests if at all possible. Even the most reliable numbers can’t tell you whether you’re going to like a camera’s look or not, or whether it’s right for the story you want to tell.

When tests aren’t possible, and I know that’s often the case in low-budget land, at least try to find an independent comparison. I’ll leave you with this video from the Slanted Lens, which compares the URSA Mini Pro G2 with the ARRI Amira (which uses the same Alev III sensor as the Alexa). They don’t measure the dynamic range, but you can at least see the images side by side, and in the end it’s the images that matter, not the numbers.


10 Reasons Why Cinemas Don’t Deserve to Survive the Pandemic

I know that as a professional director of photography I should want cinemas to recover and flourish. After all, even if many of the productions I work on don’t get a theatrical release, my livelihood must still be in some indirect way tied to the methods of exhibition, of which cinema is a foundational pillar. But I think we’ve reached the point where the film industry could survive the death of fleapits, and I’m starting to think that wouldn’t be such a bad thing.

Disclaimer: I’m writing this from a place of anger. Last Friday, the day that the cinemas of Cambridge reopened, I went along to the Light for a screening of Jurassic Park. The experience – which I shall detail fully in a future post – reminded me why going to the cinema can often be frustrating or disappointing. Since lockdown we’ve added the risk of deadly infection to the downsides, and before long we’ll have to add huge price hikes, the inevitable consequence of all those empty seats between households. (Controversially, I think that current ticket prices are reasonable.)

Setting Covid-19 to one side for the moment, here are ten long-standing reasons why cinemas deserve to be put out of their misery.

 

1. No real film any more

My faith in cinema was seriously shaken in the early 2010s when 35mm projection was binned in favour of digital. Some may prefer the crisp quality of electronic images, but for me the magic was in the weave, the dirt, the cigarette burns. The more like real life it looks, the less appeal it holds.

 

2. Adverts

I’m not sure what’s worse, the adverts themselves, or the people who aim to arrive after the adverts and overshoot, spoiling the first few minutes of the movie by walking in front of the screen as they come in late.

 

3. No ushers

Yes, I’m old enough to remember ushers in cinemas, just as I’m old enough to remember when supermarket shelf-stackers waited until the shop was closed before infesting the aisles. (Perhaps the unwanted stackers could be seconded to the needy cinema auditoria?) It’s not that I need a waistcoated teenager with a torch to show me to my seat, but I do need them there to discourage the range of antisocial behaviours in the next three points.

 

4. People eating noisily

I understand that the economics make it unavoidable for cinemas to supplement their income by selling overpriced snacks. But do they have to sell such noisy ones? Is it beyond the wit of humanity to develop quieter packaging? Or for the gluttons to chomp and rustle a little less energetically, especially during the softer scenes?

 

5. People chatting

One of the Harry Potter films was ruined by a child behind me constantly asking his mum what was happening… and his mum answering in great detail every time. Serves me right for going to a kids’ film, perhaps, but you never know what kind of movie might be spoiled by unwanted additional dialogue. I recall a very unpopular individual who answered his phone during The Last Jedi. And I’m sure we’ve all experienced that most maddening of all cinema phenomena: the people who inexplicably attend purely to hold conversations with each other, often conversations that aren’t even related to the film.

(5a. People snoring – a significant drawback of Vue’s recliner seats.)

 

6. People looking at their phones

“The light from your phone can be distracting too,” say the announcements, and they’re not wrong. Basically, the biggest problem with cinemas is people.

 

7. Arctic air conditioning

Why is cinema air con always turned up so high? No matter how hot it is outside, you always have to take a jacket to keep off the artificial chill in the auditorium.

 

8. Small screens

Home TV screens have been getting bigger for years, so why are cinema screens going the opposite way? Shouldn’t cinemas be trying to give their customers something they can’t experience at home? There’s nothing more disappointing than shelling out for a ticket and walking into the auditorium to see a screen the size of a postage stamp.

 

9. Bad projection

The purpose of going to the cinema is to see a movie projected at the highest possible technical quality by competent professionals, but the reality is often far from that. Stretched, cropped, faint or blurry images – I’ve witnessed the whole gamut of crimes against cinematography. The projectionists seem poorly trained, unfairly lumbered with multiple screens, and locked out of making crucial adjustments to the sound and picture. And because there are no ushers, it’s up to you to miss a couple of minutes of the movie by stepping outside to find someone to complain to.

 

10. Netflix is better

This is the killer. This is what will ultimately bring cinemas down. TV used to be film’s poorer cousin, but these days long-form streaming shows are better written, better photographed and infinitely more engaging than most of what traditional filmmakers seem able to create. Maybe it’s just that I’m middle-aged now, and movies are still being made exclusively for 16-25-year-olds, but it’s rare for a film to excite me the way a series can.

Having said all of that, Christopher Nolan’s Tenet is out on Wednesday. Now that’s something I am looking forward to, if I can just find somewhere showing it on 70mm….


10 Clever Camera Tricks in “Aliens”

In 1983, up-and-coming director James Cameron was hired to script a sequel to Ridley Scott’s 1979 hit Alien. He had to pause halfway through to shoot The Terminator, but the subsequent success of that movie, along with the eventually completed Aliens screenplay, so impressed the powers that be at Fox that they greenlit the film with the relatively inexperienced 31-year-old at the helm.

Although the sequel was awarded a budget of $18.5 million – $7.5 million more than Scott’s original – that was still tight given the much more expansive and ambitious nature of Cameron’s script. Consequently, the director and his team had to come up with some clever tricks to put their vision on celluloid.

 

1. Mirror Image

When contact is lost with the Hadley’s Hope colony on LV-426, Ripley (Sigourney Weaver) is hired as a sort of alien-consultant to a team of crack marines. The hypersleep capsules from which the team emerge on reaching the planet were expensive to build. Production designer Peter Lamont’s solution was to make just half of them, and place a mirror at the end of the set to double them up.

 

2. Small Screens

Wide shots of Hadley’s Hope were accomplished with fifth-scale miniatures by Robert and Dennis Skotak of 4-Ward Productions. Although impressive, sprawling across two Pinewood stages, the models didn’t always convince. To help, the crew often downgraded the images by showing them on TV monitors, complete with analogue glitching, or by shooting through practical smoke and rain.

 

3. Big Screens

The filmmakers opted for rear projection to show views out of cockpit windscreens and colony windows. This worked out cheaper than blue-screen composites, and allowed for dirt and condensation on the glass, which would have been impossible to key optically. Rear projection was also employed for the crash of the dropship – the marines’ getaway vehicle – permitting camera dynamics that again were not possible with compositing technology of the time.

 

4. Back to Front

A highlight of Aliens is the terrifying scene in which Ripley and her young charge Newt (Carrie Henn) are trapped in a room with two facehuggers, deliberately set loose by sinister Company man Carter Burke (Paul Reiser). These nightmarish spider-hands were primarily puppets trailing cables to their operators. To portray them leaping onto a chair and then towards camera, a floppy facehugger was placed in its final position and then tugged to the floor with a fishing wire. The film was reversed to create the illusion of a jump.

 

5. Upside Down

Like Scott before him, Cameron was careful to obfuscate the man-in-a-suit nature of the alien drones wherever possible. One technique he used was to film the creatures crawling on the floor, with the camera upside-down so that they appeared to be hanging from the ceiling. This is seen when Michael Biehn’s Hicks peeks through the false ceiling to find out how the motion-tracked aliens can be “inside the room”.

 

6. Flash Frames

All hell (represented by stark red emergency lighting) breaks loose when the aliens drop through the false ceiling. To punch up the visual impact of the movie’s futuristic weapons, strobe lights were aimed at the trigger-happy marines. Taking this effect even further, editor Ray Lovejoy spliced individual frames of white leader film into the shots. As a result, the negative cutter remarked that Aliens‘ 12th reel had more cuts than any complete movie he’d ever worked on.

 

7. Cotton Cloud

With most of the marines slaughtered, Ripley heads to the atmospheric processing plant to rescue Newt from the alien nest. Aided by the android Bishop (Lance Henriksen) they escape just before the plant’s nuclear reactor explodes. The ensuing mushroom cloud is a miniature sculpture made of cotton wool and fibreglass, illuminated by an internal lightbulb!

 

8. Hole in the floor

Returning to the orbiting Sulaco, Ripley and friends are ambushed by the stowaway queen, who rips Bishop in half. A pre-split, spring-loaded dummy of Henriksen was constructed for that moment, and was followed by the simple trick of concealing the actor’s legs beneath a hole in the floor. As in the first movie, android blood was represented by milk. This gradually soured as the filming progressed, much to Henriksen’s chagrin as the script required him to be coated in the stuff and even to spit it out of his mouth.

 

9. Big Battle

The alien queen was constructed and operated by Stan Winston Studios as a full-scale puppet. Two puppeteers were concealed inside, while others moved the legs with rods or controlled the crane from which the body hung. The iconic power loader was similar, with a body builder concealed inside and a counter-weighted support rig. This being before the advent of digital wire removal, all the cables and rods had to be obfuscated with smoke and shifting shadows, though they can still be seen on frame grabs like this one. (The queen is one of my Ten Greatest Movie Puppets of All Time.)

 

10. Little Battle

For wide shots of the final fight, both the queen and the power loader were duplicated as quarter scale puppets. Controlled from beneath the miniature set via rods and cables, the puppets could perform big movements, like falling into the airlock, which would have been very difficult with the full-size props. (When the airlock door opens, the starfield beyond is a black sheet with Christmas lights on it!) The two scales cut seamlessly together and produce a thrilling finale to this classic film.

For more on the visual effects of James Cameron movies, see my rundown of the top five low-tech effects in Hollywood films (featuring Titanic) and a breakdown of the submarine chase in The Abyss.
