“Superman II” Retrospective

At Christmas 1978, when Superman: The Movie opened to enthusiastic reviews and record-breaking box office, it was no surprise that a sequel was in the works. What was unusual was that the majority of that sequel had already been filmed, and stranger still, much of it would be re-filmed before Superman II hit cinemas two years later.

Jerry Siegel and Joe Shuster’s comic-book icon had made several superhuman leaps to the screen by the 1970s, but Superman: The Movie was his first big-budget feature film. Producer Pierre Spengler and executive producer father/son team Alexander and Ilya Salkind purchased the rights from DC Comics in 1974 and made a deal to finance not one but two Superman movies on the understanding that Warner Bros. would buy the finished products. Salkind senior had unintentionally pioneered back-to-back shooting the previous year when he decided to split The Three Musketeers – originally intended as a three-hour epic – into two shorter films.

After packaging Superman I and II with A-listers Marlon Brando (as Kryptonian patriarch Jor-El) and Gene Hackman (as the villainous Lex Luthor), the producers hired The Omen director Richard Donner to helm the massive production. Donner cast the unknown Christopher Reeve in the title role, while John Williams was signed to compose what would prove to be one of the most famous soundtracks in cinematic history. Like many big genre productions of the time – Star Wars and Alien to name but two – Superman set up camp in England, with cameras rolling for the first time on March 24th, 1977.

Tom Mankiewicz (creative consultant), Marlon Brando (Jor-El), Richard Donner (director), Pierre Spengler (producer). Brando is dressed in black to isolate his head for the Fortress of Solitude hologram effects.

“We were shooting scenes from the two films simultaneously, according to production conveniences,” explained creative consultant Tom Mankiewicz in a 2001 documentary. “So when we had Gene Hackman we were shooting scenes from II and scenes from I, or when we were in the Daily Planet we were shooting scenes from both pictures in the Daily Planet, while you were in that set.”

Today – largely thanks to Peter Jackson’s Lord of the Rings trilogy – we are used to enormous, multi-year productions with crew numbers in four figures, but the scale of the dual Superman shoot was unprecedented at the time, eventually reaching nineteen months in duration. It was originally scheduled for eight.

“Dick [Donner] never in the course of the picture got a budget; he never got a schedule,” claimed Mankiewicz. “He was constantly told that he was over schedule, way over budget, but nobody told him what that budget was or how much he was over that budget.”

Given that overspends were funded by Warner Bros. in return for more distribution rights, Spengler and the Salkinds were watching the value of their huge investment trickle away. So despite Donner’s popularity with the rest of the cast and crew, his relationship with the producers became ever more strained, to the point where they weren’t even on speaking terms.

Richard Lester directed iconic Swinging Sixties films like “A Hard Day’s Night” and “The Knack… and How to Get It”.

Ilya Salkind suggested bringing in The Three Musketeers director Richard Lester, who agreed on condition that he would be paid monies still owed to him from that earlier film. By some accounts his role on Superman was that of a mediator between the director and the producers; by others, he was a co-producer, second unit director or even a back-up director in case Donner cracked under the pressure of the endless shoot. “Where does this leave… Donner?” asked a newspaper report of the time. “‘Nervous,’ a cast member says.”

Eventually, with the first movie’s release date looming, the filmmakers decided on a change of plan. Superman II would be placed on the back burner in order to prioritise finishing Superman: The Movie – and get it earning money as quickly as possible. At this point, three quarters of the sequel was already in the can, including all scenes featuring Brando and Hackman, both of whom had had contractual wrap dates to meet.

Superman: The Movie was a hit, but Donner would not direct the remainder of its sequel. “They have to want me to do it,” he said of the producers at the time. “It has to be on my terms and I don’t mean financially, I mean control.” Of Spengler specifically, Donner was reported to bluntly state, “If he’s on it – I’m not.”

And indeed Donner was not. The Salkinds had no intention of acceding to his demands. Instead, the former mediator Richard Lester was hired to complete Superman II, and Donner received a telegram telling him that his services were no longer required. “I was ready to get on an airplane and kill,” he recalled years later, “because they were taking my baby away from me.”

Master of miniatures Derek Meddings wets down the New York street. Many miniature effects were reshot simply so Lester could claim directorship of them.

Meanwhile Brando was trying (unsuccessfully) to sue the producers over royalties, and demanded a significant cut of the box office gross from the sequel. Rather than pay this, the producers elected to re-film his scenes, replacing Jor-El with Superman’s mother Lara, as played by Susannah York.

It was far from the only reshooting of Superman II footage that took place. Ironically, given the earlier budget concerns, Lester was permitted to redo large chunks of Donner’s material with a rewritten script in order to earn a credit as director under guild rules. Major changes included a new opening sequence on the Eiffel Tower, Lois Lane’s realisation of Clark Kent’s true identity after he trips and falls into a fireplace, and a different ending in which a magic kiss from Clark erases that realisation from her memory.

Some of the reshoots included Lex Luthor material, but Hackman declined to return out of loyalty to Donner; the result is the fairly obvious use of a double in the climactic Fortress of Solitude scene. The deaths of Geoffrey Unsworth and John Barry, plus creative differences between Lester and John Williams, meant that the sequel team also featured a new DP (Robert Paynter), a new production designer (Peter Murton) and a new composer (Ken Thorne), although significant contributions from all of the original HODs remain in the finished film.

Comparing his own directing style with Donner’s, Lester told interviewers, “I think that Donner was emphasising a kind of grandiose myth… There was a type of epic quality which isn’t in my nature… I’m more quirky and I play around with slightly more unexpected silliness.” Indeed his material is characterised by visual gags and a generally less serious approach, which he would continue into Superman III (1983).

Although some of the unused Donner scenes were incorporated into TV screenings over the years, it was not until the 2001 DVD restoration of the first movie that interest began to build in a release for the full, unseen version of the sequel. When Brando’s footage was rediscovered a few years later, it could finally become a reality.

Footage from Margot Kidder’s 35mm screen test was incorporated into the Donner Cut to show Lois Lane discovering Clark Kent’s true identity. Although it causes some minor continuity errors, the scene is far more intelligent than Lester’s rug-tripping revelation.

“I don’t think there is [another] film that had so much footage shot and not used,” remarked editor Michael Thau. A vast cataloguing and restoration effort was undertaken to make useable the footage which had been sitting in Technicolor’s London vault for a quarter of a century. Donner and Mankiewicz returned to oversee and approve the process, which used only the minimum of Lester material necessary to tell a complete story, plus footage from Reeve’s and Margot Kidder’s 35mm screen tests.

Released on DVD in 2006, the Donner Cut suffers from the odd cheap visual effect used to plug plot holes, and a familiar turning-back-time ending which was originally scripted for the sequel but moved to the first film at the last minute. However, for fans of Superman: The Movie, this version of Superman II is much closer in tone and ties in much better in story terms too. The Donner Cut is also less silly than the theatrical version, though it must be said that Lester’s humour contributed in no small part to the sequel’s original success.

Whichever version you prefer, 40 years on from its first release, Superman II is still a fun and thrilling adventure with impressive visuals and an utterly believable central performance from the late, great Christopher Reeve.


Finding the Positive in 2020

This year hasn’t been great for anyone. It certainly hasn’t for me. Even as I’m writing this I’m hearing the news that, in a staggeringly foreseeable U-turn, the Christmas bubble rules have been severely restricted. So how to wrap up this stinker of a year?

I considered making this article about the pandemic’s impact on the film and TV industries and speculating about which changes might be permanent. But who needs more doom and gloom? Not me.

Instead, here are six positive things that I accomplished this year:

  1. We shot the final block of War of the Worlds: The Attack in February/March, and I was recently shown a top-secret trailer which is very exciting. There is plenty of post work still to do on this modern-day reimagining of the H.G. Wells classic, but hopefully it will see the light of day in a year or so.
  2. After a couple of lax years, I got back to blogging regularly. This site now has a staggering 1,250 posts!
  3. I completed and released my first online course, Cinematic Lighting, which has proven very popular. It currently has over 1,000 students and a star rating which has consistently hovered around 4.5 out of 5.
  4. I made a zoetrope and shot several 35mm timelapses and animations for it, which was a nice lockdown project. Even though the animations didn’t come out that well they were fun to do, and the zoetrope itself is just a cool object to keep on the shelf.
  5. I wrote my first article for British Cinematographer, the official magazine of the BSC, which will be published on January 15th. In the process I got to interview (albeit by email) several DPs I’ve admired for a while including David Higgs BSC, Colin Watkinson ASC, BSC and Benedict Spence.
  6. The lockdown gave me the time to update my showreel. Who knows if I’ll ever work again as a DP, but if not, at least I can look back with pride at some of the images I’ve captured over the years.

Despite the restrictions, I hope all my readers manage to find some joy, love and comfort over the festive period. And if not, just consume a lot of mulled wine and chocolate; it’s the next best thing.

In a tradition I’ve neglected for a few years, I’ll leave you with a rundown of my ten favourite blog posts from 2020.

  1. “The Rise of Anamorphic Lenses in TV” – a look at some of the shows embracing oval bokeh and horizontal flares
  2. “5 Steps to Lighting a Forest at Night” – breaking down how to light a place that realistically shouldn’t have any light
  3. “Above the Clouds: The Spoiler Blogs” – including how we faked a plane scene with a tiny set-piece in the director’s living room
  4. “Working with White Walls” – analysing a couple of short films where I found ways to make the white-walled locations look more cinematic
  5. “10 Clever Camera Tricks in Aliens” – the genius of vintage James Cameron
  6. “The Cinematography of Chernobyl” – how DP Jakob Ihre used lighting and lensing to tell this horrifying true story
  7. “5 Things Bob Ross Can Teach Us About Cinematography” – who knew that the soft-spoken painter had so much movie-making wisdom?
  8. “5 Ways to Fake Firelight” – a range of ways to simulate the dynamics of flames, from the low-tech to the cutting edge
  9. “A Post-lockdown Trip to the Cinema” – an account of the projection sacrilege committed against a classic movie in my first fleapit trip of the Covid era
  10. “Exposure” (four-part series) – in-depth explanations of aperture, ND filters, shutter angles and ISO

A Cinematographer’s Guide to Looking Good on a Webcam

This night shot is lit by a desk-lamp bounced off the wall behind and to the left of camera, plus the monitor light and the Christmas lights you can see at frame left. The background is lit by a reading lamp bounced off the ceiling (just out of frame right) and more Christmas lights.

It may be the beginning of the end for Covid-19, but it doesn’t look like home working and video calls are going away any time soon. We’re very lucky that we have such technology in the midst of a global pandemic, but let’s be honest: webcams don’t always make us look our best. Having lit and photographed movies for 20 years, I’d like to share a few tips to improve your image on Zoom, WhatsApp, Google Meet or whatever your video call software of choice is.

Firstly, low camera angles are not flattering to many people. Wherever possible, set up your webcam so that it’s at eye-level or a little above. If you’re using a laptop, this might mean stacking a few books under the device. Consider investing in a laptop stand that will raise the monitor and camera up if you’re going to be doing this a lot.

Avoid placing the camera too close to yourself. A medium close-up works best for most video calls, head and shoulders at the closest, or down to your waist if you like to gesticulate a lot. Follow the classic rules of composition and make the most of your camera’s resolution by framing your head near the top of the shot, rather than leaving a lot of empty headroom above yourself.

It’s important to be aware of automatic exposure if you want to look your best on a webcam. Your camera and/or software continually assess the average luminance in the frame and alter the shutter speed or electronic gain to achieve what they think is the correct exposure. Since webcams have very poor dynamic range – they can’t handle a great deal of contrast within the frame – you should think carefully about what elements in your shot could sway the auto-exposure.

For example, a bright white wall, window or table lamp in the background will cause the camera to reduce its exposure, darkening the overall image and perhaps turning you into a silhouette. Even the colour of top you’re wearing can be a factor. If you have a pale skin tone and you’re wearing a black top – prompting the camera to increase its exposure – you might well find that your face bleaches out.

The black hoodie causes the automatic exposure to rise, bleaching out my face.
The lighter tone of this t-shirt enables the automatic exposure to produce a balanced image.

This brings us to lighting. Most of us are used to lighting our homes and workspaces so that we can see what we’re doing comfortably, rather than worrying about how the light is falling on our own faces.

The clearest and most flattering type of lighting is generally a large, soft source roughly in front of and slightly above us, so if possible position your computer or webcam in front of a window. If direct sunlight comes in through this window, that is less ideal; try to cut it off with your curtains. The indirect light of sky and clouds is much softer and less likely to confuse the auto-exposure.

If you have little or no natural light to work with, the main source of light on your face might well be the monitor you’re looking at. In this case, what you have on your screen can make a huge difference. A blank white Word document is going to light you much more effectively than a paused Netflix frame from a horror movie.

Monitor light can leave you looking blue and ghostly, so consider placing a strategic window of pale orange colour on your virtual desktop to warm up your skin tone. Try adjusting the monitor’s brightness or switching to a darker desktop theme if your monitor is bleaching your face out completely.

Of course, your screen is not just a light source. You need to be able to use it for actually viewing things too, so a better solution is not to rely on it for light. Instead, create another soft source in front of and slightly above you by pointing a desk-lamp at the wall above your monitor. (If the wall is a dark or saturated colour, pin up something white to reflect the light.) The larger the patch of wall the lamp illuminates, the more softly your face will be lit.

You may find that your background now looks very dim, because little of the light from your monitor – or bouncing off the wall behind your monitor – is reaching it. Worse still, the auto-exposure might react to this dim background by over-exposing your face. In this case, use a second lamp to illuminate the background.

Often the room’s main ceiling light will do the job here, though it will likely result in an image that has an overall flat look to it. That might be just what you need for a professional video call, but if not, feel free to get creative with your background. Use table lamps to pick out certain areas, string up fairy lights, or whatever you feel best reflects your personality and profession.

The main thing is to get your “key light” right first – that’s the soft source in front of you that keeps you lit nicely. Everything after that is just icing on the cake.

This moodier shot has much the same set-up as the image at the top of this post, but with a brighter light in the background and a dimmer light in the foreground.

“A Cliché for the End of the World”

Photo: Catherine Ashenden

In August 2019 Jonnie Howard, director of The Knowledge, approached me about shooting an unusual short film with him. A Cliché for the End of the World is only two minutes long, but Jonnie wanted to shoot it as two unbroken takes which would be presented side by side. Each take would follow one character, starting and ending with them next to each other, but separating in the middle.

My first thought was that the two takes would have to be shot concurrently, but to squeeze two cameras into the small location and keep each out of the other’s frame would have been impossible. Instead, we settled on shooting with a single camera. After capturing 18 takes of the first side, Jonnie reviewed the footage with his editor Kat and selected one to use. We then shot the other side, with Kat calling out cues that would keep the actors in sync with the selected “master” take. (It took 18 takes to get this side in the can as well, partly because of getting the cues right and partly because of the difficulties Steadicam op Luke Oliver had in manoeuvring up the narrow staircase.)

The film had to be lit in a way that worked for both sides, with the camera starting in the living room looking towards the kitchen, moving up the stairs, through the landing and into the bedroom.

The HMI skips off the floor (left); Jeremy creates the dynamic look of TV light (right)

Working as usual to the general principle of lighting from the back, I set up a 2.5K HMI outside the kitchen window to punch a shaft of sunlight into the room. I angled this steeply so that it would not reach the actors directly, but instead bounce off the floor and light them indirectly. (See my article on lighting through windows.)

Gaffer Jeremy Dawson blacked out the living room windows to keep the foreground dark. He used an LED panel set to 6,600K (versus our camera’s white balance of 5,600K) to simulate an off-screen TV, waving a piece of black wrap in front of it to create dynamics.

The HMI outside (left); the diffused Dedo in the loft (right)

Next we needed to bring up the light levels for the actor’s journey up the stairs, which were naturally darker. Jeremy and spark Gareth Neal opened the loft hatch on the landing and rigged an LED Dedo inside, aimed at the darkest part of the staircase. They diffused this with some kind of net curtain I think.

To brighten the landing we set up a diffused 2×4 Kino Flo in the spare room and partially closed the door to give the light some shape. Both this and the loft Dedo were a couple of stops under key so as not to look too artificial.

Luke Oliver balances Jonnie’s C200 on his Steadicam rig.

All that remained was the bedroom. The characters were to end up sitting on the bed facing the window. Originally the camera in both takes was to finish facing them, with the window behind it, but this would have meant shadowing the actors, not to mention that space between the bed and the window was very limited. After some discussion between me, Jonnie, Luke, the cast, and production designer Amanda Stekly, we ended up moving the bed so that the camera could shoot the actors from behind, looking towards the window. This of course made for much more interesting and dimensional lighting.

The window looked out onto the street, and with a narrow pavement and no permission from the council, rigging a light outside was out of the question. Furthermore, we knew that the sun was going to shine right into that window later in the day, seriously messing with our continuity. Unfortunately all we could do was ask Amanda to dress the window with a net curtain. This took the worst of the harshness out of any direct sun and hopefully disguised the natural changes in light throughout the day at least a little.

When the sun did blast in through the window at about 6pm, we added a layer of unbleached muslin behind the net curtain to soften it further. We doubled this as the angle of the sun got more straight-on, then removed it entirely when the sun vanished behind the rooftops opposite at 7pm. About 20 minutes later we rigged a daylight LED panel in the room, bouncing off the ceiling, as a fill to counteract the diminishing natural light. We wrapped just as it was becoming impossible to match to earlier takes.

We were shooting in RAW on a Canon C200, which should give some grading latitude to help match takes from different times of day. The split-screen nature of the film means that the match needs to be very close though!

As I write this, the film is still in postproduction, and I very much look forward to seeing how it comes out. I’ll leave you with the start and end frames from slate 2, take 17, with a very quick and dirty grade.


The Hardest Shot I’ve Ever Done

It is night. We Steadicam into a moonlit bedroom, drifting across a window – where a raven is visible on the outside ledge, tapping at the glass with its beak – and land on a sleeping couple. The woman, Annabel, wakes up and goes to the window, causing the bird to flee. Crossing over to her far shoulder, we rest on Annabel’s reflection for a moment, before racking focus to another woman outside, maybe 200ft away, running towards a cliff. All in one shot.

Such was the action required in a scene from Annabel Lee, the most ambitious short I’ve ever been involved with. Based on Edgar Allan Poe’s poem, the film was the brainchild of actor Angel Parker, who plays the titular character. It was directed by Amy Coop, who had already decided to shoot on an Alexa Mini with Cooke Anamorphics before I was even hired.

Working with animals has its own difficulties, but for me as director of photography the challenges of this particular shot were:

  1. Making the bedroom appear moonlit by the single window, without any lamps being visible at any point in the Steadicam move.
  2. Lighting the view outside.
  3. Ensuring the live raven read on camera even though the shot was quite wide.
  4. Making Annabel bright enough that her reflection would read, without washing out the rest of the scene.
  5. Blocking the camera in concert with Annabel’s move so that its reflection would not be seen.

I left that last one in the capable hands of Steadicam op Rupert Peddle, along with Angel and Amy. What they ended up doing was timing Angel’s move so that she would block the window from camera at the moment that the camera’s reflection would have appeared.

Meanwhile, I put my head together with gaffer Bertil Mulvad to tackle the other four challenges. We arrived at a set-up using only three lights:

  1. A LiteMat 1 above the window (indoors) which served to light Annabel and her reflection, as well as reaching the bed.
  2. Another LED source outside the window to one side, lighting the raven.
  3. A nine-light Maxibrute on a cherry-picker, side-lighting the woman outside and the cliffs. This was gelled with CTB to match the daylight LEDs.

Unfortunately the outside LED panel backlit the window glass, which was old and kept fogging up, obscuring the raven. With hindsight that panel might have been better on the other side of the window (left rather than right, but still outside), even though it would have created some spill problems inside. (To be honest, this would have made the lighting direction more consistent with the Maxibrute “moonlight” as well. It’s so easy to see this stuff after the fact!)

Everything else worked very well, but editor Jim Page did have to cut in a close-up of the raven, without which you’d never have known it was there.


Exposure Part 4: ISO

So far in this series we have seen how we can adjust exposure using aperture, which affects depth of field, ND filters, which can help us retain the depth of field we want, and shutter angle, which affects motion blur and flickering of certain light sources. In this final part we’ll look at ISO, perhaps the most misunderstood element of exposure, if indeed we can technically classify it as part of exposure at all!


What is ISO?

The acronym stands for International Organization for Standardization, the body which in 1974 combined the old ASA (American Standards Association) units of film speed with the German DIN standard. That’s why you’ll often hear the terms ISO and ASA used interchangeably.

Two different cameras filming the same scene with the same filters, aperture and shutter settings will not necessarily produce an image of equal brightness, because the ways that their electronics convert light into video signals are different. That is why we need ISO, which defines the relationship between the amount of light reaching the sensor (or film) and the brightness of the resulting image.

For example, a common ISO to shoot at today is 800. One way of defining ISO 800 is that it’s the setting required to correctly expose a key-light of 12 foot-candles with a lens set to T2.8 and a 180° shutter at 24fps (1/48th of a second).

If we double the ISO we double the effective sensitivity of the camera, or halve the amount of light it requires. So at ISO 1600 we would only need 6 foot-candles of light (all the other settings being the same), and at ISO 3200 we would need just 3 foot-candles. Conversely, at ISO 400 we would need about 25 foot-candles, or 50 at ISO 200.
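
If you like seeing the arithmetic spelled out, here is a quick Python sketch of that proportionality, using the ISO 800 / 12 foot-candle reference point quoted above. Treat it as an illustration of the doubling rule, not a calibration tool.

```python
# Sketch of the ISO/light-level relationship described above.
# Reference point (from the text): ISO 800 exposes correctly with a
# 12 foot-candle key at T2.8 and a 180° shutter at 24fps.

def required_footcandles(iso, ref_iso=800, ref_footcandles=12):
    """Each doubling of ISO halves the light the camera needs."""
    return ref_footcandles * ref_iso / iso

for iso in (200, 400, 800, 1600, 3200):
    print(iso, required_footcandles(iso))
# 200 48.0, 400 24.0, 800 12.0, 1600 6.0, 3200 3.0
```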


A Flawed Analogy

Note that I said “effective” sensitivity. This is an important point. In the photochemical world, ISO indeed denotes the light sensitivity of the film stock. It is tempting to see digital ISO as representing the sensitivity of the sensor, and changing the ISO as analogous to loading a different film stock. But in reality the sensitivity of a digital sensor is fixed, and the ISO only determines the amount of gain applied to the sensor data before it is processed (which may happen in camera if you’re shooting linear or log, or in post if you’re shooting RAW).

So a better analogy is that altering the ISO is like altering how long the lab develops the exposed film negative for. This alters the film’s exposure index (EI), hence some digital cameras using the term EI in their menus instead of ISO or ASA.

We can take this analogy further. Film manufacturers specify a recommended development time, an arbitrary period designed to produce the optimal image. If you increase (push) or decrease (pull) the development time you will get a lighter or darker image respectively, but the quality of the image will be reduced in various ways. Similarly, digital camera manufacturers specify a native ISO, which is essentially the recommended amount of gain applied to the sensor data to produce what the manufacturer feels is the best image, and if you move away from that native ISO you’ll get a subjectively “lower quality” image.

Compare the graininess/smoothness of the blacks in these images from my 2017 tests. Click to enlarge.

The most obvious side effect of increasing the ISO is more noticeable noise in the image. It’s exactly the same as turning up the volume on an amplifier; you hear more hiss because the noise floor is being boosted along with the signal itself.

I remember the days of Mini-DV cameras, which instead of ISO had gain; my Canon XL1 had gain settings of -3dB, +6dB and +12dB. It was the exact same thing, just with a different name. What the XL1 called 0dB of gain was what we call the native ISO today.


ISO and Dynamic Range

At this point we need to bring in the concept of dynamic range. Let’s take the Arri Alexa as an example. This camera has a dynamic range of 14 stops. At its native ISO of 800, those 14 stops of dynamic range are equally distributed above and below “correct” exposure (known as middle grey), so you can overexpose by up to seven stops, and underexpose by up to seven stops, without losing detail.

If you change the Alexa’s ISO, those limits of under- and overexposure still apply, but they’re shifted around middle grey. For example, at 400 ISO you have eight stops of detail below middle grey, but only six above it. This means that, assuming you adjust your iris, shutter or filters to compensate for the change in ISO, you can trade off highlight detail for shadow detail, or vice versa.
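
To make that shifting split concrete, here is a small Python sketch assuming a 14-stop camera with a native ISO of 800 and a symmetrical seven-over, seven-under distribution at native, as in the Alexa example above; other cameras distribute their range differently.

```python
import math

# How the latitude above and below middle grey shifts with ISO,
# assuming a 14-stop sensor that splits 7/7 at its native ISO of 800.

def latitude_split(iso, native_iso=800, total_stops=14):
    shift = math.log2(iso / native_iso)    # +1 for every doubling of ISO
    stops_above = total_stops / 2 + shift  # highlight latitude
    stops_below = total_stops / 2 - shift  # shadow latitude
    return stops_above, stops_below

print(latitude_split(400))   # (6.0, 8.0) -> more shadow detail
print(latitude_split(1600))  # (8.0, 6.0) -> more highlight detail
```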

Imagine underexposing a shot by one stop and bringing it back up in post. You increase the highlight detail, because you’re letting half the light through to the sensor, reducing the risk of clipped whites, but you also increase the noise when you bring it up in post. This is basically what you’re doing when you increase your ISO, except that if you’re recording in linear or log then the restoration of brightness and increase in gain happen within the camera, rather than in post with RAW.

Note the increased detail in the bulb at higher ISOs. Click to enlarge.

We can summarise all this as follows:

Doubling the ISO…

  • increases overall brightness by one stop, and
  • increases picture noise.

Then adjusting the exposure to compensate (e.g. closing the iris one stop)…

  • restores overall brightness to its original value,
  • gives you one more stop of detail in the highlights, and
  • gives you one less stop of detail in the shadows.

Alternatively, halving the ISO…

  • decreases overall brightness by one stop, and
  • decreases picture noise.

Then adjusting the exposure to compensate (e.g. opening the iris one stop)…

  • restores overall brightness to its original value,
  • gives you one less stop of detail in the highlights, and
  • gives you one more stop of detail in the shadows.


Conclusion

This brings me to the end of my exposure series. We’ve seen that choosing the “correct” exposure is a balancing act, taking into account not just the intended brightness of the image but also the desired depth of field, bokeh, lens flares, motion blur, flicker prevention, noise and dynamic range. I hope this series has helped you to make the best creative decisions on your next production.

See also: “6 Ways to Judge Exposure”


Exposure Part 3: Shutter

In the first two parts of this series we saw how exposure can be controlled using the lens aperture – with side effects including changes to the depth of field – and neutral density (ND) filters. Today we will look at another means of exposure control: shutter angle.


The Physical Shutters of Film Cameras

As with aperture, an understanding of what’s going on under the hood is useful, and that begins with celluloid. Let’s imagine we’re shooting on film at 24fps, the most common frame rate. The film can’t move continuously through the gate (the opening behind the lens where the focused light strikes the film) or we would end up recording just a long vertical streak of light. The film must remain stationary long enough to expose an image, before being moved on by a distance of four perforations (the standard height of a 35mm film frame) so that the next frame can be exposed. Crucially, light must not hit the film while it is being moved, or vertical streaking will occur.

Joram van Hartingsveldt, CC BY-SA 3.0

This is where the shutter comes in. The shutter is a portion of a disc that spins in front of the gate. The standard shutter angle is 180°, meaning that the shutter is a semi-circle. We always describe shutter angles by the portion of the disc which is missing, so a 270° shutter (admitting 1.5x the light of a 180° shutter) is a quarter of a circle, and a 90° shutter (admitting half the light of a 180° shutter) is three-quarters.

The shutter spins continuously at the same speed as the frame rate – so at 24fps the shutter makes 24 revolutions per second. With a 180° shutter, each 24th of a second is therefore divided into two halves, i.e. two 48ths of a second:

  • During one 48th of a second, the missing part of the shutter is over the gate, allowing the light to pass through and the stationary film to be exposed.
  • During the other 48th of a second, the shutter blocks the gate to prevent light hitting the film as it is advanced. The shutter has a mirrored surface so that light from the lens is reflected up the viewfinder, allowing the camera operator to see what they’re shooting.


Intervals vs. Angles

If you come from a stills or ENG background, you may be more used to talking about shutter intervals rather than angles. The two things are related as follows:

Frame rate x (360 ÷ shutter angle) = shutter interval denominator

For example, 24 x (360 ÷ 180) = 48 so a film running at 24fps, shot with a 180° shutter, shows us only a 48th of a second’s worth of light on each frame. This has been the standard frame rate and shutter angle in cinema since the introduction of sound in the late 1920s. The amount of motion blur captured in a 48th of a second is the amount that we as an audience have been trained to expect from motion pictures all our lives.
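
Here is the same relationship as a tiny Python sketch, just to show the conversion in both directions (the function names are mine, not any camera’s API):

```python
# The angle/interval relationship given above, in both directions.

def interval_denominator(fps, shutter_angle):
    """Return x for an exposure of 1/x second."""
    return fps * (360 / shutter_angle)

def shutter_angle(fps, denominator):
    """Inverse: the angle that gives an exposure of 1/denominator second."""
    return 360 * fps / denominator

print(interval_denominator(24, 180))  # 48.0 -> the classic 1/48th of a second
print(shutter_angle(25, 50))          # 180.0 -> 25fps at 1/50th is also 180°
```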

A greater (larger shutter angle, longer shutter interval) or lesser (smaller shutter angle, shorter shutter interval) amount of motion blur looks unusual to us and thus can be used to creative effect. Saving Private Ryan features one of the best-known examples of a small shutter angle in its D-day landing sequence, where the lack of motion blur creates a crisp, hyper-real effect that draws you into the horror of the battle. The effect has been endlessly copied since then, to the point that it now feels almost mandatory to shoot action scenes with a small shutter angle.

Large shutter angles are less common, but the extra motion blur can imply a drugged, fatigued or dream-like state.

In today’s digital environment, only the Arri Alexa Studio has a physical shutter. In other cameras, the sensor’s photo-sites are allowed to charge with light over a certain period of time – still referred to as the shutter interval, even though no actual shutter is involved. The same principles apply and the same 180° angle of the virtual shutter is standard. The camera will allow you to select a shutter angle/interval from a number of options, and on some models like the Canon C300 there is a menu setting to switch between displaying the shutter setting as an angle or an interval.


When to Change the Shutter Angle

Sometimes it is necessary to change the shutter angle to avoid flickering. Some luminous devices, such as TV screens and monitors, or HMI lighting not set to flicker-free mode, will appear to strobe, pulse or roll on camera. This is due to them turning on and off multiple times per second, in sync with the alternating current of the mains power supply, but not necessarily in sync with the shutter. For example, if you shoot a domestic fluorescent lamp in the UK, where the mains AC cycles at 50Hz, your 1/48th (180° at 24fps) shutter will be out of sync and the lamp will appear to throb or flicker on camera. The solution is to set the shutter to 172.8° (1/50th), which is indeed what most DPs do when shooting features in the UK. Shutter intervals whose denominators are round multiples of the AC frequency, such as 1/100th, will also work.
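
The 172.8° figure falls straight out of the formula from the previous section: choose an exposure time that is a whole division of the mains cycle. A quick sketch, with the 60Hz line added as my own extrapolation:

```python
# Flicker-safe shutter angles: match the exposure time to 1/(mains frequency),
# or to a round multiple of that frequency.

def flicker_safe_angle(fps, mains_hz, multiple=1):
    """Angle giving an exposure of 1/(mains_hz * multiple) second."""
    return 360 * fps / (mains_hz * multiple)

print(flicker_safe_angle(24, 50))     # 172.8° -> 1/50th in 50Hz countries
print(flicker_safe_angle(24, 50, 2))  # 86.4°  -> 1/100th also works
print(flicker_safe_angle(24, 60))     # 144.0° -> 1/60th for 60Hz mains
```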

You may notice that I have barely mentioned exposure so far in this article. This is because, unlike stills photographers, DPs rarely use the shutter as a means of adjusting exposure. An exception is that we may increase the shutter angle when the daylight is fading, to grab an extra shot. By doubling the shutter angle from 172.8° to 345.6° we double the light admitted, i.e. we gain one stop. As long as there isn’t any fast movement, the extra motion blur is likely to go unnoticed by the audience.

One of the hallmarks of amateur cinematography is that sunny scenes have no motion blur, due to the operator (or the camera’s auto mode) decreasing the shutter interval to avoid over-exposure. It is preferable to use ND filters to cut light on bright days, as covered in part two of this series.

For the best results, the 180° (or thereabouts) shutter angle should be retained when shooting slow motion as well. If your camera displays intervals rather than angles, ideally your interval denominator should be double the frame rate. So if you want to shoot at 50fps, set the shutter interval to 1/100th. For 100fps, set the shutter to 1/200th, and so on.

If you do need to change the shutter angle for creative or technical reasons, you will usually want to compensate with the aperture. If you halve the time the shutter is open for, you must double the area of the aperture to maintain the same exposure, and vice versa. For example, if your iris was set to T4 and you change the shutter from 180° to 90° you will need to stop up to T2.8. (Refer back to my article on aperture if you need to refresh your memory about T-stops.)

In the final part of this series we’ll get to grips with ISO.

Learn more about exposure in my online course, Cinematic Lighting. Until this Thursday (19/11/20) you can get it for the special price of £15.99 by using the voucher code INSTA90.


Exposure Part 2: Neutral Density (ND) Filters

In the first part of this series, I explained the concepts of f-stops and T-stops, and looked at how aperture can be used to control exposure. We saw that changing the aperture causes side effects, most noticeably altering the depth of field.

How can we set the correct exposure without compromising our depth of field? Well, as we’ll see later in this series, we can adjust the shutter angle and/or ISO, but both of those have their own side effects. More commonly a DP will use neutral density (ND) filters to control the amount of light reaching the lens. These filters get their name from the fact that they block all wavelengths of light equally, so they darken the image without affecting the colour.


When to Use an ND Filter

Let’s look at an example. Imagine that I want to shoot at T4; this aperture gives a nice depth of field, on the shallow side but not excessively so. My subject is very close to a bright window and my incident light meter is giving me a reading of f/11. (Although I’m aiming for a T-stop rather than an f-stop, I can still use the f-number my meter gives me; in fact if my lens were marked in f-stops then my exposure would be slightly off because the meter does not know the transmission efficiency of my lens.) Let’s remind ourselves of the f-stop/T-stop series before we go any further:

1      1.4      2      2.8      4      5.6      8      11      16      22     32

By looking at this series, which can be found printed on any lens barrel or permanently displayed on a light meter’s screen, I can see that f/11 (or T11) is three stops down from f/4 (or T4) – because 11 is three numbers to the right of 4 in the series. To achieve correct exposure at T4 I’ll need to cut three stops of light. I can often be seen on set counting the stops like this on my light meter or on my fingers. It is of course possible to work it out mathematically or with an app, but that’s not usually necessary. You quickly memorise the series of stops with practice.
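
If you do want to check the count mathematically, the stop difference is twice the base-2 logarithm of the ratio of the two f-numbers, because each stop multiplies the f-number by the square root of two. A rough Python sketch, bearing in mind that the engraved numbers are rounded (f/11 is really about f/11.3):

```python
import math

# Counting stops between a metered f-number and the T-stop you want to shoot at.
# Nominal markings are rounded, so round the result to the nearest whole stop.

def stop_difference(metered_f_number, target_stop):
    return 2 * math.log2(metered_f_number / target_stop)

print(round(stop_difference(11, 4)))    # 3 -> three stops to cut
print(round(stop_difference(8, 5.6)))   # 1
print(round(stop_difference(22, 2.8)))  # 6
```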


What Strength of Filter to Choose

Some ND filters are marked in stops, so I could simply select a 3-stop ND and slide it into my matte box or screw it onto my lens. Other times – the built-in ND filters on the Sony FS7, for example – they’re defined by the fraction of light they let through. So the FS7’s 1/4 ND cuts two stops; the first stop halves the light – as we saw in part one of this series – and the second stop halves it again, leaving us a quarter of the original amount. The 1/16 setting cuts four stops.

However, most commonly, ND filters are labelled in optical density. A popular range of ND filters amongst professional cinematographers are those made by Tiffen, and a typical set might be labelled as follows:

.3      .6      .9      1.2

That’s the optical density, a property defined as the common (base-10) logarithm of the ratio of the quantity of light entering the filter to the quantity of light exiting it on the other side. A .3 ND reduces the light by half because 10 raised to the power of -0.3 is about 0.5, and reducing light by half, as we’ve previously established, means dropping one stop.

If that maths is a bit much for you, don’t worry. All you really need to do is multiply the number of stops you want to cut by 0.3 to find the filter you need. So, going back to my example with the bright window, to get from T11 to T4, i.e. to cut three stops, I’ll pick the .9 ND.
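
In code form the rule of thumb is just a multiplication; this little sketch follows the 0.3-per-stop convention of the Tiffen-style markings described above.

```python
# Turning a number of stops into an ND optical density (0.3 per stop).

def nd_density(stops_to_cut):
    return round(0.3 * stops_to_cut, 1)

print(nd_density(3))  # 0.9 -> the .9 ND from the window example
print(nd_density(6))  # 1.8 -> no 1.8 in the kit? stack a 1.2 and a .6
```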

It’s far from intuitive at first, but once you get your head around it, and memorise the f-stops, it’s not too difficult. Trust me!

Here are a couple more examples:

  • Light meter reads f/8 and you want to shoot at T5.6. That’s a one stop difference. (5.6 and 8 are right next to each other in the stop series, as you’ll see if you scroll back to the top.) 1 x 0.3 = 0.3 so you should use the .3 ND.
  • Light meter reads f/22 and you want to shoot at T2.8. That’s a six stop difference (scroll back up and count them), and 6 x 0.3 = 1.8, so you need a 1.8 ND filter. If you don’t have one, you need to stack two NDs in your matte box that add up to 1.8, e.g. a 1.2 and a .6.


Variations on a Theme

Variable ND filters are also available. These consist of two polarising filters which can be rotated against each other to progressively lighten or darken the image. They’re great for shooting guerilla-style with a small crew. You can set your iris where you want it for depth of field, then expose the image by eye simply by turning the filter. On the down side, they’re hard to use with a light meter because there is often little correspondence between the markings on the filter and stops. They can also have a subtle adverse effect on skin tones, draining a person’s apparent vitality, as some of the light which reflects off human skin is polarised.

IR pollution increases with successively stronger ND filters (left to right) used on a Blackmagic Micro Cinema Camera. The blue dyes in this costume evidently reflect a large amount of IR.

Another issue to look out for with ND filters is infra-red (IR). Some filters cut only the visible wavelengths of light, allowing IR to pass through. Some digital sensors will interpret this IR as visible red, resulting in an image with a red colour cast which can be hard to grade out because different materials will be affected to different degrees. Special IR ND filters are available to eliminate this problem.

These caveats aside, ND filters are the best way to adjust exposure (downwards at least) without affecting the image in any other way.

In the next part of this series I’ll look at shutter angles, what they mean, how they affect exposure and what the side effects are.

Learn how to use ND filters practically with my Cinematic Lighting online course. Enter voucher code INSTA90 for an amazing 90% off.


Exposure Part 1: Aperture

This is the first in a series of posts where I will look in detail at the four means of controlling the brightness of a digital video image: aperture, neutral density (ND) filters, shutter angle and ISO. It is not uncommon for newer cinematographers to have only a partial understanding of these topics, enough to get by in most situations; that was certainly the case with me for many years. The aim of this series is to give you an understanding of the underlying mechanics which will enable you to make more informed creative decisions.

You can change any one of the four factors, or any combination of them, to reach your desired level of exposure. However, most of them will also affect the image in other ways; for example, aperture affects depth of field. One of the key responsibilities of the director of photography is to use each of the four factors not just to create the ideal exposure, but to make appropriate use of these “side effects” as well.


F-stops and T-stops

The most common way of altering exposure is to adjust the aperture, a.k.a. the iris, sometimes described as changing “the stop”. Just like the pupil in our eyes, the aperture of a photographic lens is a (roughly) circular opening which can be expanded or contracted to permit more or less light through to the sensor.

You will have seen a series of numbers like this printed on the sides of lenses:

1      1.4      2      2.8      4      5.6      8      11      16      22     32

These are ratios – ratios of the lens’ focal length to its iris diameter. So a 50mm lens with a 25mm diameter iris is at f/2. Other lengths of lens would have different iris diameters at f/2 (e.g. 10mm diameter for a 20mm lens) but they would all produce an image of the same brightness. That’s why we use f-stops to talk about iris rather than diameters.

But why not label a lens 1, 2, 3, 4…? Why 1, 1.4, 2, 2.8…? These magic numbers are f-stops. A lens set to f/1.4 will let in twice as much light as (or “one stop more than”) a lens set to f/2, which in turn will let in twice as much as one set to f/2.8, and so on. Conversely, a lens set to f/2.8 will let in half as much light as (or “one stop less than”) a lens set to f/2, and so on. (Note that a number between any of these f-stops, e.g. f/1.8, is properly called an f-number, but not an f-stop.) These doublings or halvings – technically known as a base-2 logarithmic scale – are a fundamental concept in exposure, and mimic our eyes’ response to light.

If you think back to high-school maths and the πr² formula for calculating the area of a circle from its radius, the reason for the seemingly random series of numbers will start to become clear. Letting in twice as much light requires twice as much area for those light rays to fall on, and remember that the f-number is the ratio of the focal length to the iris diameter, so you can see how square roots are going to get involved and why f-stops aren’t just plain old round numbers.

If you’re shooting with a cine lens, rather than a stills lens, you’ll see the same series of numbers on the barrel, but here they are T-stops rather than f-stops. T-stops are f-stops adjusted to compensate for the light transmission efficiency. Two different lenses set to, say, f/2 will not necessarily produce equally bright images, because some percentage of light travelling through the elements will always be lost, and that percentage will vary depending on the quality of the glass and the number of elements. A lens with 100% light transmission would have the same f-number and T-number, but in practice the T-number will always be a little bigger than the f-number. For example, Cooke’s 15-40mm zoom is rated at a maximum aperture of T2 or f/1.84.
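
As a worked illustration, here is a short Python sketch of both definitions. The transmission figure for the Cooke zoom is inferred from its published f/1.84 and T2 ratings rather than taken from a spec sheet.

```python
import math

# f-number: ratio of focal length to iris diameter.
# T-number: f-number corrected for the fraction of light the glass transmits.

def f_number(focal_length_mm, iris_diameter_mm):
    return focal_length_mm / iris_diameter_mm

def t_number(f_num, transmission):
    """transmission is the fraction of light passed by the lens (0-1)."""
    return f_num / math.sqrt(transmission)

print(f_number(50, 25))                # 2.0 -> 50mm lens, 25mm iris = f/2
print(round(t_number(1.84, 0.85), 2))  # ~2.0 -> roughly the Cooke 15-40mm case
```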


Fast and Slow Lenses

When buying or renting a lens, one of the first things you will want to know is its maximum aperture. Lenses are often described as being fast (larger maximum aperture, denoted by a smaller f- or T-number like T1.4) or slow (smaller maximum aperture, denoted by a bigger f- or T-number like T4). These terms come from the fact that the shutter speed would need to be faster or slower to capture the same amount of light… but more on that later in the series.

Faster lenses are generally more expensive, but that expense may well be outweighed by the savings made on lighting equipment. Let’s take a simple example, and imagine an interview lit by a 4-bank Kino Flo and exposed at T2.8. If our lens can open one stop wider (known as stopping up) to T2 then we double the amount of light reaching the sensor. We can therefore halve the level of light – by turning off two of the Kino Flo’s tubes or by renting a cheaper 2-bank unit in the first place. If we can stop up further, to T1.4, then we only need one Kino tube to achieve the same exposure.


Side Effects

One of the first things that budding cinematographers learn is that wider apertures make for a smaller depth of field, i.e. the range of distances within which a subject will be in focus is smaller. In simple terms, the background of the image is blurrier when the depth of field is shallower.

It is often tempting to go for the shallowest possible depth of field, because it feels more cinematic and helps conceal shortcomings in the production design, but that is not the right look for every story. A DP will often choose a stop to shoot at based on the depth of field they desire. That choice of stop may affect the entire lighting budget; if you want to shoot at a very slow T14 like Douglas Slocombe did for the Indiana Jones trilogy, you’re going to need several trucks full of lights!

There is another side effect of adjusting the aperture which is less obvious. Lenses are manufactured to perform best in the middle of their iris range. If you open a lens up to its maximum aperture or close it down to its minimum, the image will soften a little. Therefore another advantage of faster lenses is the ability to get further away from their maximum aperture (and poorest image quality) with the same amount of light.

Finally it is worth noting that the appearance of bokeh (out of focus areas) and lens flares also changes with aperture. The Cooke S4 range, for example, renders out-of-focus highlights as circles when wide open, but as octagons when stopped down. With all lenses, the star pattern seen around bright light sources will be stronger when the aperture is smaller. You should shoot tests – like these I conducted in 2017 – if these image artefacts are a critical part of your film’s look.

Next time we’ll look at how we can use ND filters to control exposure without compromising our choice of stop.

Learn how to use exposure practically with my Cinematic Lighting online course. Enter voucher code INSTA90 for an amazing 90% off.


The First Light a Cinematographer Should Put Up

Where do you start, as a director of photography lighting a set? What should be the first brushstroke when you’re painting with light?

I believe the answer is backlight, and I think many DPs would agree with me.

Let’s take the example of a night exterior in a historical fantasy piece, as featured in my online course, Cinematic Lighting. The main source of light in such a scene would be the moon. Where am I going to put it? At the back.

The before image is lit by an LED panel serving purely as a work-light while we rehearsed. It’s not directly above the camera, but off to the right, so the lighting isn’t completely flat, but there is very little depth in the image. Beyond the gate is a boring black void.

The after image completely transforms the viewer’s understanding of the three-dimensional space. We get the sense of a world beyond the gate, an intriguing world lighter than the foreground, with a glimpse of trees and space. Composing the brazier in the foreground has added a further plane, again increasing the three-dimensional impression.

Here is the lighting diagram for the scene. (Loads more diagrams like this can be seen on my Instagram feed.)

The “moon” is a 2.5KW HMI fresnel way back amongst the trees, hidden from camera by the wall on the right. This throws the gate and the characters into silhouette, creating a rim of light around their camera-right sides.

To shed a little light on Ivan’s face as he looks camera-left, I hid a 4×4′ Kino Flo behind the lefthand wall, again behind the actors.

The LED from the rehearsal, a Neewer 480, hasn’t moved, but now it has an orange gel and is dimmed very low to subtly enhance the firelight. Note how the contrasting colours in the frame add to the depth as well.

So I’ll always go into a scene looking at where to put a big backlight, and then seeing if I need any additional sources. Sometimes I don’t, like in this scene from the Daylight Interior module of the course.

Backlight for interior scenes is different to night exteriors. You cannot simply put it where you want it. You must work with the position of the windows. When I’m prepping interiors, I always work with the director to try to block the scene so that we can face towards the window as much as possible, making it our backlight. If a set is being built, I’ll talk to the production designer at the design stage to get windows put in to backlight the main camera positions whenever possible.

In the above example, lit by just the 2.5K HMI outside the window, I actually blacked out windows behind camera so that they would not fill in the nice shadows created by the backlight.

Daylight exteriors are different again. I never use artificial lights outdoors in daytime any more. I prefer to work with the natural light and employ reflectors, diffusion or negative fill to mould it where necessary.

So it’s very important to block the scene with the camera facing the sun whenever possible. Predicting the sun path may take a little work, but it will always be worth it.

Here I’ve shot south, towards the low November sun, and didn’t need to modify the light at all.

Shooting in the opposite direction would have looked flat and uninteresting, not to mention causing potential problems with the cast squinting in the sunlight, and boom and camera shadows being cast on them.

You can learn much more about the principles and practice of cinematic lighting by taking my online course on Udemy. Currently you can get an amazing 90% off using the voucher code INSTA90 until November 19th.

For more examples of building a scene around backlight, see my article “Lighting from the Back”.
