“The Little Mermaid”: Lighting from the Back

So far, this blog series about my cinematography of The Little Mermaid has covered the biggest and most complex scenes in the movie. Today I’m going to look at some smaller scenes, and at how I employed the cinematography tenet of lighting from the back to quickly build a look with depth, mood and drama.

Many of these examples are specifically cross-backlighting, something I covered in my Lighting Techniques series, but I’ll quickly recap since it has so much relevance here. It involves lighting two characters who face each other with two sources placed on the far side of the eye-line (short key) and crossed, so that each source keys one character and often backlights the other too.

So with that in mind, let’s proceed to the examples from my shooting diary.

 

Day 1

The first week is pretty much all in houses with just a few principals, so an easy start. Day 1’s schedule is tight though. We start in a third-floor bedroom – no way lamps are getting up to those windows from outside, so I’m relying on natural light augmented with a bit of cross-backlight cheated inside the room. (There’s a Kino Flo shining at Elle over Cam’s right shoulder, for example.) Once the haze is in it looks great. After we get the main coverage, we head out to the garden for the next scene, while the ‘B’ camera team steps in to pick up a couple of inserts…

 

Day 3

…It’s a night scene and the grips have tented the window. To get a nice blue glow coming in, I have two 4×4 Kino Flos set either side of the window (outside), and they give a great wrapping backlight to the actors and the set dressing. Smoke and a 3,200K white balance (the Kinos are tubed for 5,600K, so they read a cool blue) complete the look. It owes a lot to a scene from Hook, one of director Blake Harris’s reference movies which I watched during preprod. This stuff definitely filters in and inspires things!

 

Day 13

Our first day on stage. It’s weird to be back at the former supermarket I spent five weeks of preproduction in. The first set, Locke’s chamber, is very confined and the walls don’t wild (they can’t be removed to open the set up), so it’s slow going to work in there. We fire a 5K fresnel through the stained glass window at the back of the set. Then I fall back on the tried and tested method of cross-backlighting, even though I know that it will be hard to hide the lamps (a 650W fresnel in each of the upper rear corners of the set) from camera. In the end I have the art department dress drapes in front of them. For the villain’s single I leave the light hard, but for the hero’s single we use bounce boards to wrap the light around his face more…

 

Day 28

We start with the fortune-teller’s tent, another small set constructed on stage. In fact, it’s just an Easy-Up artfully draped with fabrics. Initially there’s nowhere to get light in from except the front, but I know that this will leave the scene looking flat and fake, so I work with the art department again to make holes in the top rear corners. Through those we shine tungsten-bubbled “Fat Boy” Kino Flos. (These 2ft 4-bank units are giving the dual kickers on Cam in the centre, and the beautiful down-light on the background fabrics, bringing out the ruching. Each one also provides a little key-light on the two ladies.) The other sources are “moonlight” coming in through the entrance, linking us to the circus exteriors, and a stylised slash of light across Thora’s eyes from a Source Four, suggested by Jason (key grip Jason Batey). Adding foreground practicals is an important final touch to expand the depth and scale of the set…

 

Day 31

It’s the last day of principal photography. Our big scene of the day is the newspaper office where Cam works, which is a set in the front of the studio, using the building’s real windows. We fire the 12K in and gel it with half CTS for a nice morning sunlight effect. We’re shooting towards the windows, which have blinds, so we get some nice shafts of light, though sometimes it’s a little too smoky. Running haze is a skilled and tricky job: lens length and backlight both affect how much the smoke shows up on camera. When we get it right, combined with the dark wood period furniture, it totally sells the 1937 setting. Apparently people at video village are loving it, saying it looks like Mad Men…

Next week, in the final part of my blog series on The Little Mermaid, I’ll share my experiences of shooting the sunset denouement while up to my waist in the Atlantic Ocean.


“The Little Mermaid”: Pools of Light

Although The Little Mermaid takes place mostly on dry land, there were some key scenes involving tanks and pools. These include the moment which introduces the audience to the mermaid herself, played by Poppy Drayton. Here are some extracts from my diary covering the challenges of creating a magical, fairytale look while filming in and around water.

 

Day 10

Today we’re inside the big top all day – actually all NIGHT. We can’t shoot during the day because too much daylight bleeds through the canvas of the tent.

We are setting up when a storm hits. The tent starts to blow about in a slightly alarming fashion, rain lashes down outside (and inside, because the tent isn’t very waterproof) and lightning flashes. We are ordered out of the tent, and I run into a waiting mini-van with Joe from art and some of the camera crew. We sit watching the rain and telling stories for half an hour before we can press on.

Setting up with a stand-in next to the mermaid tank (centre, behind the monitors). In the top right you can see the 575W HMI backlight for the tank, and below that, grip Sawyer Oubre stands ready to fake watery rippling light with a par can and a blue gel frame.

Around the wall of the tent the art department have hung canvas posters; at the suggestion of gaffer Mike Horton, we uplight these with par cans and par 38s. The design of these fixtures hasn’t changed since the 30s, so we can get away with seeing them in shot. The art dept have sourced four period spotlights which we use as background interest (they’re not powerful enough to really illuminate anything), as well as string-lights.

Ambience comes from a Maxi Brute, with just a couple of bubbles on, firing into the tent roof. After seeing a video test of various diffusers during preproduction, I asked for Moroccan Frost to be added to our consumables list, and we use it for the first time on this Maxi Brute. It gives a lovely muted orangey-pink look to the scene.

Steadicam operator Chris Lymberis. Photo: Kane Pearson

We’re shooting our mermaid for the very first time, in a tank in the circus ring. The initial plan is to fire a Source Four straight down into the water to create genuine watery rippling light, while bouncing a par can off a wobbling frame of blue gel to beef up the effect. In the end the Source Four isn’t really cutting it, so instead we rig a 575W HMI, gelled with Steel Blue, to a menace arm and fire it into the tank as toppy backlight. This Steel Blue gelled daylight source, blued up slightly further by the water itself, contrasts beautifully with the Moroccan Frost tungsten ambience which the Maxi Brute is giving us.

In her mermaid tail and costume, Poppy Drayton looks stunning in the tank. We shoot steadicam angles and some slo-mo to get the most out of the set-up.

 

Day 15

The rocky pool set with two of the side-lighting Kino Flos and the 1.2K HMI backlight (centre) in place

Back on stage, and we’re shooting the rocky pool. This set was built before I even arrived in Savannah, so I’ve been waiting a long time to shoot it. It’s built almost right up to the ceiling of the studio (a former supermarket) so it’s challenging to light. The grips build four menace arms and poke two 4×4 Kinos and two 575W HMIs over the sides to cross-light the set and bring out all the texture in it. Where the set ends they put up a 20×20′ greenscreen, which we light with two Kino Flo Image 80s fitted with special chroma green tubes.

After a wide (which didn’t make the final cut), the next set-up is a 2-shot of our leads in the pool itself. We consider arming the camera out over the pool using a jib, but ultimately decide that it’s better for me to join the cast in the pool, with the camera on my shoulder in a splash bag. 2nd AC Kane Pearson joins the pool party as well, and ends up hand-bashing a monitor for me since the splash bag’s designed for a Panaflex film camera and the viewfinder doesn’t line up. I’m reminded of my frustrating splash bag experience on See Saw back in 2007, but this time at least within a few minutes I’ve found a comfortable and effective way to operate the camera, under-slinging it and allowing it to partially float so I don’t have to support the whole weight.

For this shot we’ve added our par-can-bounced-off-a-wobbling-blue-gel gag for watery light ripples, and combined with the real light ripples and the reflections of a 1.2K HMI backlight, the image looks beautiful.

 

Day 19

After lunch we shoot the singles for the rocky pool scene. The pool itself has been removed, and the actors sit on stools in a paddling pool, with the set behind them. The paddling pool serves two functions: it catches the water that make-up pours over the actors to make them look wet, and it reflects rippling light (originating from a par can) onto their faces. At first this bounce flattens out the look, until we figure out that we need to lay black fabric on the bottom of the pool. This stops the par can’s light bouncing back directly, while retaining the rippling highlights off the water’s surface. (Check out my article on shooting water for more tips like this.)

The low-tech solution for the pool pick-ups

In the final edit this was all intercut with some beautiful footage by underwater DP Jordan Klein, shot both at a local diving pool in Savannah and at Weeki Wachee Springs State Park in Florida. The main unit shot another scene in the actual ocean, but I’ll cover that later in this series. In the meantime, next week I’ll reveal some of the tricks and techniques used in shooting The Little Mermaid’s many sequences in moving vehicles.


Colour Rendering Index

Many light sources we come across today have a CRI rating. Most of us realise that the higher the number, the better the quality of light, but is it really that simple? What exactly is Colour Rendering Index, how is it measured and can we trust it as cinematographers? Let’s find out.

 

What is C.R.I.?

CRI was created in 1965 by the CIE – Commission Internationale de l’Eclairage – the same body responsible for the colour-space diagram we met in my post about How Colour Works. The CIE wanted to define a standard method of measuring and rating the colour-rendering properties of light sources, particularly those which don’t emit a full spectrum of light, like fluorescent tubes which were becoming popular in the sixties. The aim was to meet the needs of architects deciding what kind of lighting to install in factories, supermarkets and the like, with little or no thought given to cinematography.

As we saw in How Colour Works, colour is caused by the absorption of certain wavelengths of light by a surface, and the reflection of others. For this to work properly, the light shining on the surface in the first place needs to consist of all the visible wavelengths. The graphs below show that daylight indeed consists of a full spectrum, as does incandescent lighting (e.g. tungsten), although its skew to the red end means that white-balancing is necessary to restore the correct proportions of colours to a photographed image. (See my article on Understanding Colour Temperature.)

Fluorescent and LED sources, however, have huge peaks and troughs in their spectral output, with some wavelengths missing completely. If the wavelengths aren’t there to begin with, they can’t reflect off the subject, so the colour of the subject will look wrong.

Analysing the spectrum of a light source to produce graphs like this required expensive equipment, so the CIE devised a simpler method of determining CRI, based on how a set of eight colour patches looked under the source being tested. These patches were murky pastel shades taken from the Munsell colour wheel (see my Colour Schemes post for more on colour wheels). In 2004, six more-saturated patches were added.

The maths which is used to arrive at a CRI value goes right over my head, but the testing process boils down to this:

  1. Illuminate a patch with daylight (if the source being tested has a correlated colour temperature of 5,000K or above) or incandescent light (if below 5,000K).
  2. Note the coordinates of the patch’s colour on a CIE colour-space diagram.
  3. Now illuminate the patch with the source being tested.
  4. Note the new coordinates of the patch’s colour on the diagram.
  5. Calculate the distance between the two sets of coordinates, i.e. the difference in colour under the two light sources.
  6. Repeat with the remaining patches and calculate the average difference.
  7. Scale the average and subtract it from 100 to get the CRI. The bigger the colour shift, the lower the score – which is how a source can even score a negative value, as the sodium streetlight below does.
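If you’d like to see the arithmetic in action, here’s a toy Python sketch of steps 5-7. The patch coordinates are invented purely for illustration – the real test uses the defined Munsell patches in the CIE 1964 colour space, after a chromatic adaptation transform I’m glossing over – but the scale-and-subtract at the end mirrors the standard’s formula.

```python
import math

# Invented coordinates for four patches: their colour under the reference
# illuminant, then under the source being tested. (The real test uses eight
# defined Munsell patches in the CIE 1964 colour space.)
reference = [(25.1, 8.2), (14.9, -3.7), (-8.4, 12.0), (3.3, -15.6)]
test = [(23.8, 9.1), (16.2, -2.9), (-7.0, 10.5), (5.0, -14.8)]

def colour_shift(a, b):
    # Euclidean distance between the patch's two appearances
    return math.hypot(a[0] - b[0], a[1] - b[1])

shifts = [colour_shift(r, t) for r, t in zip(reference, test)]
average = sum(shifts) / len(shifts)

# The standard scales the shift by 4.6 before subtracting from 100,
# which is how a bad enough source ends up with a negative CRI.
cri = 100 - 4.6 * average
print(f"average shift {average:.2f} -> CRI {cri:.1f}")
```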

Here are a few CRI ratings gleaned from around the web:

Sodium streetlight: -44
Standard fluorescent: 50-75
Standard LED: 83
LitePanels 1×1 LED: 90
Arri HMI: 90+
Kino Flo: 95
Tungsten: 100 (the maximum)

 

Problems with C.R.I.

There have been many criticisms of the CRI system. One is that the use of mean averaging results in a lamp with mediocre performance across all the patches scoring the same CRI as a lamp that does terrible rendering of one colour but good rendering of all the others.

Demonstrating the non-continuous spectrum of a fluorescent lamp, versus the continuous spectrum of incandescent, using a prism.

Further criticisms relate to the colour patches themselves. The eight standard patches are low in saturation, making them easier to render accurately than bright colours. An unscrupulous manufacturer could design their lamp to render the test colours well without worrying about the rest of the spectrum.

In practice this all means that CRI ratings sometimes don’t correspond to the evidence of your own eyes. For example, I’d wager that an HMI with a quoted CRI in the low nineties is going to render more natural skin-tones than an LED panel with the same rating.

I prefer to assess the quality of a light source by eye rather than relying on any quoted CRI value. Holding my hand up in front of an LED fixture, I can quickly tell whether the skin tones look right or not. Unfortunately even this system is flawed.

The fundamental issue is the trichromatic nature of our eyes and of cameras: both work out what colour things are based on sensory input of only red, green and blue. As an analogy, imagine a wall with a number of cracks in it. Imagine that you can only inspect it through an opaque barrier with three slits in it. Through those three slits, the wall may look completely unblemished. The cracks are there, but since they’re not aligned with the slits, you’re not aware of them. And the “slits” of the human eye are not in the same place as the slits of a camera’s sensor, i.e. the respective sensitivities of our long, medium and short cones do not quite match the red, green and blue dyes in the Bayer filters of cameras. Under continuous-spectrum lighting (“smooth wall”) this doesn’t matter, but with non-continuous-spectrum sources (“cracked wall”) it can lead to something looking right to the eye but not on camera, or vice-versa.

 

Conclusion

Given its age and its intended use, it’s not surprising that CRI is a pretty poor indicator of light quality for a modern DP or gaffer. Various alternative systems exist, including GAI (Gamut Area Index) and TLCI (Television Lighting Consistency Index), the latter similar to CRI but introducing a camera into the process rather than relying solely on human observation. The Academy of Motion Picture Arts and Sciences recently invented a system, Spectral Similarity Index (SSI), which involves measuring the source itself with a spectrometer, rather than reflected light. At the time of writing, however, we are still stuck with CRI as the dominant quantitative measure.

So what is the solution? Test, test, test. Take your chosen camera and lens system and shoot some footage with the fixtures in question. For the moment at least, that is the only way to really know what kind of light you’re getting.


Colour Schemes

Last week I looked at the science of colour: what it is, how our eyes see it, and how cameras see and process it. Now I’m going to look at colour theory – that is, schemes of mixing colours to produce aesthetically pleasing results.

 

The Colour Wheel

The first colour wheel was drawn by Sir Isaac Newton in 1704, and it’s a precursor of the CIE diagram we met last week. It’s a method of arranging hues so that useful relationships between them – like primaries and secondaries, and the schemes we’ll cover below – can be understood. As we know from last week, colour is in reality a linear spectrum which we humans perceive by deducing it from the amounts of light triggering our red, green and blue cones, but certain quirks of our visual system make a wheel in many ways a more useful arrangement of the colours than a linear spectrum.

One of these quirks is that our long (red) cones, although having peak sensitivity to red light, have a smaller peak in sensitivity at the opposite (violet) end of the spectrum. This may be what causes our perception of colour to “wrap around”.

Another quirk is in the way that colour information is encoded in the retina before being piped along the optic nerve to the brain. Rather than producing red, green and blue signals, the retina compares the levels of red to green, and of blue to yellow (the sum of red and green cones), and sends these colour opponency channels along with a luminance channel to the brain.
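As a crude illustration – my own schematic numbers, not a physiological model – here’s how an RGB triplet might be re-encoded into those opponent channels:

```python
def to_opponent(r, g, b):
    # Schematic opponent-process encoding of a normalised RGB triplet;
    # the retina's actual weightings are far more complicated.
    luminance = (r + g + b) / 3        # achromatic channel
    red_green = r - g                  # positive = reddish, negative = greenish
    blue_yellow = b - (r + g) / 2      # yellow being the sum of red and green
    return luminance, red_green, blue_yellow

print(to_opponent(1.0, 0.0, 0.0))  # pure red: strongly positive red-green
print(to_opponent(1.0, 1.0, 0.0))  # yellow: strongly negative blue-yellow
```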

You can test these opposites yourself by staring at a solid block of one of the colours for around 30 seconds and then looking at something white. The white will initially take on the opposing colour, so if you stared at red you’ll see green.

Hering’s colour wheels

19th century physiologist Ewald Hering was the first to theorise about this colour opponency, and he designed his own colour wheel to match it, having red/green on the vertical axis and blue/yellow on the horizontal.

RGB colour wheel

Today we are more familiar with the RGB colour wheel, which spaces red, green and blue equally around the circle. But both wheels – the first dealing with colour perception in the eye-brain system, and the second dealing with colour representation on an RGB screen – are relevant to cinematography.

On both wheels, colours directly opposite each other are considered to cancel each other out. (In RGB they make white when combined.) These pairs are known as complementary colours.
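In RGB the cancellation is easy to verify with a couple of lines of Python: a colour and its 8-bit complement always sum to white.

```python
def complement(rgb):
    # A hue plus its complement gives white: (255, 255, 255)
    return tuple(255 - c for c in rgb)

orange = (255, 128, 0)
print(complement(orange))  # (0, 127, 255): the blue that pairs with orange
```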

 

Complementary

A complementary scheme provides maximum colour contrast, each of the two hues making the other more vibrant. Take “The Snail” by modernist French artist Henri Matisse, which you can currently see at the Tate Modern; Matisse placed complementary colours next to each other to make them all pop.

“The Snail” by Henri Matisse (1953)

In cinematography, a single pair of complementary colours is often used, for example the yellows and blues of Aliens’ power loader scene:

“Aliens” DP: Adrian Biddle, BSC

Or this scene from Life on Mars which I covered on my YouTube show Lighting I Like:

I frequently use a blue/orange colour scheme, because it’s the natural result of mixing tungsten with cool daylight or “moonlight”.

“The First Musketeer”, DP: Neil Oseman

And then of course there’s the orange-and-teal grading so common in Hollywood:

“Hot Tub Time Machine” DP: Jack N. Green, ASC

Amélie uses a less common complementary pairing of red and green:

“Amélie” DP: Bruno Delbonnel, AFC, ASC

 

Analogous

An analogous colour scheme uses hues adjacent to each other on the wheel. It lacks the punch and vibrancy of a complementary scheme, instead having a harmonious, unifying effect. In the examples below it seems to enhance the single-mindedness of the characters. Sometimes filmmakers push analogous colours to the extreme of using literally just one hue, at which point it is technically monochrome.

“The Matrix” DP: Bill Pope, ASC
“Terminator 2: Judgment Day” DP: Adam Greenberg, ASC
“The Double” DP: Erik Alexander Wilson
“Total Recall” (1990) DP: Jost Vacano, ASC, BVK

 

There are other colour schemes, such as triadic, but complementary and analogous colours are by far the most common in cinematography. In a future post I’ll look at the psychological effects of individual colours and how they can be used to enhance the themes and emotions of a film.


How Colour Works

Colour is a powerful thing. It can identify a brand, imply eco-friendliness, gender a toy, raise our blood pressure, calm us down. But what exactly is colour? How and why do we see it? And how do cameras record it? Let’s find out.

 

The Meaning of “Light”

One of the many weird and wonderful phenomena of our universe is the electromagnetic wave, an electric and magnetic oscillation which travels at 186,000 miles per second. Like all waves, EM radiation has the inversely-proportional properties of wavelength and frequency, and we humans have devised different names for it based on these properties.

The electromagnetic spectrum

EM waves with a low frequency and therefore a long wavelength are known as radio waves or, slightly higher in frequency, microwaves; we use them to broadcast information and heat ready-meals. EM waves with a high frequency and a short wavelength are known as x-rays and gamma rays; we use them to see inside people and treat cancer.

In the middle of the electromagnetic spectrum, sandwiched between infrared and ultraviolet, is a range of frequencies between 430 and 750 terahertz (wavelengths 400-700 nanometres). We call these frequencies “light”, and they are the frequencies which the receptors in our eyes can detect.

If your retinae were instead sensitive to electromagnetic radiation of between 88 and 91 megahertz, you would be able to see BBC Radio 2. I’m not talking about magically seeing into Ken Bruce’s studio, but perceiving the FM radio waves which are encoded with his silky-smooth Scottish brogue. Since radio waves can pass through solid objects though, perceiving them would not help you to understand your environment much, whereas light waves are absorbed or reflected by most solid objects, and pass through most non-solid objects, making them perfect for building a picture of the world around you.

Within the range of human vision, we have subdivided and named smaller ranges of frequencies. For example, we describe light of about 590-620nm as “orange”, and below about 450nm as “violet”. This is all colour really is: a small range of wavelengths (or frequencies) of electromagnetic radiation, or a combination of them.
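Wavelength and frequency are linked by the speed of light, so you can sanity-check those figures yourself:

```python
C = 299_792_458  # speed of light in metres per second

def terahertz(wavelength_nm):
    # frequency = speed of light / wavelength
    return C / (wavelength_nm * 1e-9) / 1e12

print(terahertz(700))  # ~428 THz: deep red
print(terahertz(550))  # ~545 THz: green
print(terahertz(400))  # ~750 THz: violet
```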

 

In the eye of the beholder

Scanning electron micrograph of a retina

The inside rear surfaces of your eyeballs are coated with light-sensitive cells called rods and cones, named for their shapes.

The human eye has about five or six million cones. They come in three types: short, medium and long, referring to the wavelengths to which they are sensitive. Short cones have peak sensitivity at about 420nm, medium at 530nm and long at 560nm, roughly what we call blue, green and red respectively. The ratios of the three cone types vary from person to person, but short (blue) ones are always in the minority.

Rods are far more numerous – about 90 million per eye – and around a hundred times more sensitive than cones. (You can think of your eyes as having dual native ISOs like a Panasonic Varicam, with your rods having an ISO six or seven stops faster than your cones.) The trade-off is that they are less temporally and spatially accurate than cones, making it harder to see detail and fast movement with rods. However, rods only really come into play in dark conditions. Because there is just one type of rod, we cannot distinguish colours in low light, and because rods are most sensitive to wavelengths of 500nm, cyan shades appear brightest. That’s why cinematographers have been painting night scenes with everything from steel grey to candy blue light since the advent of colour film.

The spectral sensitivity of short (blue), medium (green) and long (red) cones

The three types of cone are what allow us – in well-lit conditions – to have colour vision. This trichromatic vision is not universal, however. Many animals have tetrachromatic (four channel) vision, and research has discovered some rare humans with it too. On the other hand, some animals, and “colour-blind” humans, are dichromats, having only two types of cone in their retinae. But in most people, perceptions of colour result from combinations of red, green and blue. A combination of red and blue light, for example, appears as magenta. All three of the primaries together make white.

Compared with the hair cells in the cochlea of your ears, which are capable of sensing a continuous spectrum of audio frequencies, trichromacy is quite a crude system, and it can be fooled. If your red and green cones are triggered equally, for example, you have no way of telling whether you are seeing a combination of red and green light, or pure yellow light, which falls between red and green in the spectrum. Both will appear yellow to you, but only one really is. That’s like being unable to hear the difference between, say, the note D and a combination of the notes C and E. (For more info on these colour metamers and how they can cause problems with certain types of lighting, check out Phil Rhodes’ excellent article on Red Shark News.)
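To see how the fooling works numerically, here’s a toy simulation. The bell-curve sensitivities and spectra are stand-ins I’ve made up, not measured data, but the principle holds: two physically different spectra can integrate to near-identical cone responses.

```python
import numpy as np

wavelengths = np.arange(400, 701)  # the visible range in nm

def band(peak, width):
    # A made-up bell curve standing in for a sensitivity or emission spectrum
    return np.exp(-(((wavelengths - peak) / width) ** 2))

# Rough cone peaks from above: short 420nm, medium 530nm, long 560nm
cones = {"S": band(420, 40), "M": band(530, 40), "L": band(560, 40)}

def cone_response(spectrum):
    # Each cone type integrates the incoming spectrum against its sensitivity
    return {name: round(float((s * spectrum).sum()), 1)
            for name, s in cones.items()}

pure_yellow = band(575, 10)                          # narrow band at 575nm
red_plus_green = 0.5 * band(620, 10) + 0.62 * band(545, 10)

print(cone_response(pure_yellow))     # tweak the 0.5 and 0.62 gains and the
print(cone_response(red_plus_green))  # two responses converge: a metamer
```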

 

Artificial eye

A Bayer filter

Mimicking your eyes, video sensors also use a trichromatic system. This is convenient because it means that although a camera and TV can’t record or display yellow, for example, they can produce a mix of red and green which, as we’ve just established, is indistinguishable from yellow to the human eye.

Rather than using three different types of receptor, each sensitive to different frequencies of light, electronic sensors all rely on separating different wavelengths of light before they hit the receptors. The most common method is a colour filter array (CFA) placed immediately over the photosites, and the most common type of CFA is the Bayer filter, patented in 1976 by an Eastman Kodak employee named Dr Bryce Bayer.

The Bayer filter is a colour mosaic which allows only green light through to 50% of the photosites, only red light through to 25%, and only blue to the remaining 25%. The logic is that green is the colour your eyes are most sensitive to overall, and that your vision is much more dependent on luminance than chrominance.
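Here’s a minimal sketch of that mosaic: tile the classic RGGB cell across a sensor-sized grid and count the photosites.

```python
import numpy as np

# The classic 2x2 Bayer cell, with green on the diagonal
cell = np.array([["R", "G"],
                 ["G", "B"]])

height, width = 1620, 2880  # the photosite grid discussed below
cfa = np.tile(cell, (height // 2, width // 2))

for colour in "RGB":
    share = (cfa == colour).mean()
    print(colour, f"{share:.0%}")  # G 50%, R 25%, B 25%
```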

A RAW, non-debayered image

The resulting image must be debayered (or more generally, demosaiced) by an algorithm to produce a viewable image. If you’re recording log or linear then this happens in-camera, whereas if you’re shooting RAW it must be done in post.

This system has implications for resolution. Let’s say your sensor is 2880×1620. You might think that’s the number of pixels, but strictly speaking it isn’t. It’s the number of photosites, and due to the Bayer filter no single one of those photosites has more than a third of the necessary colour information to form a pixel of the final image. Calculating that final image – by debayering the RAW data – reduces the real resolution of the image by 20-33%. That’s why cameras like the Arri Alexa or the Blackmagic Cinema Camera shoot at 2.8K or 2.5K, because once it’s debayered you’re left with an image of 2K (cinema standard) resolution.

 

Colour Compression

Your optic nerve can only transmit about one percent of the information captured by the retina, so a huge amount of data compression is carried out within the eye. Similarly, video data from an electronic sensor is usually compressed, be it within the camera or afterwards. Luminance information is often prioritised over chrominance during compression.

Examples of chroma subsampling ratios

You have probably come across chroma subsampling expressed as, for example, 444 or 422, as in ProRes 4444 (the final 4 being transparency information, only relevant to files generated in postproduction) and ProRes 422. The three digits describe the ratios of colour and luminance information: a file with 444 chroma subsampling has no colour compression; a 422 file retains colour information only in every second pixel; a 420 file, such as those on a DVD or BluRay, contains one pixel of blue info and one of red info (the green being derived from those two and the luminance) to every four pixels of luma.
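Those ratios translate directly into data rates. A back-of-an-envelope sketch, assuming one byte (8 bits) per sample and ignoring any transparency channel:

```python
# Luma + chroma samples per 2x2 block of pixels under each scheme
schemes = {"4:4:4": 4 + 4 + 4, "4:2:2": 4 + 2 + 2, "4:2:0": 4 + 1 + 1}

pixels = 1920 * 1080  # one HD frame
for name, samples in schemes.items():
    megabytes = pixels * samples / 4 / 1e6  # samples per 4 pixels, 1 byte each
    print(f"{name}: {megabytes:.1f} MB/frame ({samples / 12:.0%} of 4:4:4)")
```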

Whether every pixel, or only a fraction of them, has colour information, the precision of that colour info can vary. This is known as bit depth or colour depth. The more bits allocated to describing the colour of each pixel (or group of pixels), the more precise the colours of the image will be. DSLRs typically record video in 24-bit colour, more commonly described as 8bpc or 8 bits per (colour) channel. Images of this bit depth fall apart pretty quickly when you try to grade them. Professional cinema cameras record 10 or 12 bits per channel, which is much more flexible in postproduction.
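The jump from 8 to 10 bits sounds small, but the arithmetic says otherwise:

```python
for bits in (8, 10, 12):
    levels = 2 ** bits     # tonal steps per colour channel
    colours = levels ** 3  # three channels, so levels cubed distinct colours
    print(f"{bits}-bit: {levels:,} levels per channel, {colours:,} colours")
```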

CIE diagram showing the gamuts of three video standards. D65 is the standard for white.

The third attribute of recorded colour is gamut, the breadth of the spectrum of colours. You may have seen a CIE (Commission Internationale de l’Eclairage) diagram, which depicts the range of colours perceptible by human vision. Triangles are often superimposed on this diagram to illustrate the gamut (range of colours) that can be described by various colour spaces. The three colour spaces you are most likely to come across are, in ascending order of gamut size: Rec.709, an old standard that is still used by many monitors; P3, used by digital cinema projectors; and Rec.2020. The latter is the standard for ultra-HD, and Netflix are already requiring that some of their shows are delivered in it, even though monitors capable of displaying the full Rec.2020 gamut do not yet exist. Most cinema cameras today can record images in Rec.709 (known as “video” mode on Blackmagic cameras) or a proprietary wide gamut (“film” mode on a Blackmagic, or “log” on others) which allows more flexibility in the grading suite. Note that the two modes also alter the recording of luminance and dynamic range.
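Since a gamut is literally a triangle on the CIE diagram, testing whether a given chromaticity falls inside one is simple geometry. Here’s a sketch using the published Rec.709 primaries and the D65 white point mentioned in the caption above:

```python
# Chromaticity (x, y) of the Rec.709 primaries and the D65 white point
R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
D65 = (0.3127, 0.3290)

def inside_gamut(p, a=R, b=G, c=B):
    # Barycentric-sign test: is point p inside triangle abc?
    def sign(p1, p2, p3):
        return ((p1[0] - p3[0]) * (p2[1] - p3[1])
                - (p2[0] - p3[0]) * (p1[1] - p3[1]))
    d1, d2, d3 = sign(p, a, b), sign(p, b, c), sign(p, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

print(inside_gamut(D65))           # True: white sits inside every gamut
print(inside_gamut((0.70, 0.29)))  # False: a red beyond Rec.709's primary
```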

To summarise as simply as possible: chroma subsampling is the proportion of pixels which have colour information, bit depth is the accuracy of that information and gamut is the limits of that info.

That’s all for today. In future posts I will look at how some of the above science leads to colour theory and how cinematographers can make practical use of it.


A History of Black and White

The contact sheet from my first roll of Ilford Delta 3200

Having lately shot my first roll of black-and-white film in a decade, I thought now would be a good time to delve into the story of monochrome image-making and the various reasons artists have eschewed colour.

I found the recent National Gallery exhibition, Monochrome: Painting in Black and White, a great primer on the history of the unhued image. Beginning with examples from medieval religious art, the exhibition took in grisaille works of the Renaissance before demonstrating the battle between painting and early photography, and finishing with monochrome modern art.

Several of the pictures on display were studies or sketches which were generated in preparation for colour paintings. Ignoring hue allowed the artists to focus on form and composition, and this is still one of black-and-white’s great strengths today: stripping away chroma to heighten other pictorial effects.

“Nativity” by Petrus Christus, c. 1455

What fascinated me most in the exhibition were the medieval religious paintings in the first room. Here, Old Testament scenes in black-and-white were painted around a larger, colour scene from the New Testament; as in the modern TV trope, the flashbacks were in black-and-white. In other pictures, a colour scene was framed by a monochrome rendering of stonework – often incredibly realistic – designed to fool the viewer into thinking they were seeing a painting in an architectural nook.

During cinema’s long transition from black-and-white to colour, filmmakers also used the two modes to define different layers of reality. When colour processes were still in their infancy and very expensive, filmmakers selected particular scenes to pick out in rainbow hues, while the surrounding material remained in black-and-white like the borders of the medieval paintings. By 1939 the borders were shrinking, as The Wizard of Oz portrayed Kansas, the ordinary world, in black-and-white, while rendering Oz – the bulk of the running time – in colour.

Michael Powell, Emeric Pressburger and legendary Technicolor cinematographer Jack Cardiff, OBE, BSC subverted expectations with their 1946 fantasy-romance A Matter of Life and Death, set partly on Earth and partly in heaven. Says Cardiff in his autobiography:

Quite early on I had said casually to Michael Powell, “Of course heaven will be in colour, won’t it?” And Michael replied, “No. Heaven will be in black and white.” He could see I was startled, and grinned: “Because everyone will expect heaven to be in colour, I’m doing it in black-and-white.”

Ironically Cardiff had never shot in black-and-white before, and he ultimately captured the heavenly scenes on three-strip Technicolor, but didn’t have the colour fully developed, resulting in a pearlescent monochrome.

Meanwhile, DPs like John Alton, ASC were pushing greyscale cinematography to its apogee with a genre that would come to be known as film noir. Oppressed Jews like Alton fled the rising Nazism of Europe for the US, bringing German Expressionism with them. The result was a trend of hardboiled thrillers lit with oppressive contrast, harsh shadows, concealing silhouettes and dramatic angles, all of which were heightened by the lack of distracting colour.

A classic bit of Alton's noir lighting from The Big Combo
“The Big Combo” DP: John Alton, ASC

Alton himself had a paradoxical relationship with chroma, famously stating that “black and white are colours”. While he is best known today for his noir, his only Oscar win was for his work on the Technicolor musical An American in Paris, the designers of which hated Alton for the brightly-coloured light he tried to splash over their sets and costumes.

It wasn’t just Alton that was moving to colour. Soon the economics were clear: chromatic cinema was more marketable and no longer prohibitively expensive. The writing was on the wall for black-and-white movies, and by the end of the sixties they were all but gone.

I was brought up in a world of default colour, and the first time I can remember becoming aware of black-and-white was when Schindler’s List was released in 1993. I can clearly recall a friend’s mother refusing to see the film because she felt she wouldn’t be getting her money’s worth if there was no colour. She’s not alone in this view, and that’s why producers are never keen to green-light monochrome movies. Spielberg only got away with it because his name was proven box office gold.

“Schindler’s List” DP: Janusz Kamiński, ASC

A few years later, Jonathan Frakes and his DP Matthew F. Leonetti, ASC wanted to shoot the holodeck sequence of Star Trek: First Contact in black-and-white, but the studio deemed test footage “too experimental”. For the most part, the same attitude prevails today. Despite being marketed as a “visionary” director ever since Pan’s Labyrinth, Guillermo del Toro’s vision of The Shape of Water as a black-and-white film was rejected by financiers. He only got the multi-Oscar-winning fairytale off the ground by reluctantly agreeing to shoot in colour.

Yet there is reason to be hopeful about black-and-white remaining an option for filmmakers. In 2007 MGM denied Frank Darabont the chance to make The Mist in black-and-white, but they permitted a desaturated version on the DVD. Darabont had this to say:

No, it doesn’t look real. Film itself [is a] heightened recreation of reality. To me, black-and-white takes that one step further. It gives you a view of the world that doesn’t really exist in reality and the only place you can see that representation of the world is in a black-and-white movie.

“The Mist” DP: Rohn Schmidt

In 2016, a “black and chrome” version of Mad Max: Fury Road was released on DVD and Blu-Ray, with director George Miller saying:

The best version of “Road Warrior” [“Mad Max 2”] was what we called a “slash dupe,” a cheap, black-and-white version of the movie for the composer. Something about it seemed more authentic and elemental. So I asked Eric Whipp, the [“Fury Road”] colourist, “Can I see some scenes in black-and-white with quite a bit of contrast?” They looked great. So I said to the guys at Warners, “Can we put a black-and-white version on the DVD?”

One of the James Mangold photos which inspired “Logan Noir”

The following year, Logan director James Mangold’s black-and-white on-set photos proved so popular with the public that he decided to create a monochrome version of the movie. “The western and noir vibes of the film seemed to shine in the form, and there was not a trace of the modern comic hero movie sheen,” he said. Most significantly, the studio approved a limited theatrical release for Logan Noir, presumably seeing the extra dollar-signs of a second release, rather than the reduced dollar-signs of a greyscale picture.

Perhaps the medium of black-and-white imaging has come full circle. During the Renaissance, greyscale images were preparatory sketches, stepping stones to finished products in colour. Today, the work-in-progress slash dupe of Road Warrior and James Mangold’s photographic studies of Logan were also stepping stones to colour products, while at the same time closing the loop by inspiring black-and-white products too.

With the era of budget- and technology-mandated monochrome outside the living memory of many viewers today, I think there is a new willingness to accept black-and-white as an artistic choice. The acclaimed sci-fi anthology series Black Mirror released an episode in greyscale this year, and where Netflix goes, others are bound to follow.


Roger Deakins’ Oscar-winning Cinematography of “Blade Runner 2049”

After fourteen nominations, celebrated cinematographer Roger Deakins, CBE, BSC, ASC finally won an Oscar last night, for his work on Denis Villeneuve’s Blade Runner 2049. Villeneuve’s sequel to Ridley Scott’s 1982 sci-fi noir is not a perfect film; its measured, thoughtful pace is not to everyone’s taste, and it has serious issues with women – all of the female characters being highly sexualised, callously slaughtered, or both – but the Best Cinematography Oscar was undoubtedly well deserved. Let’s take a look at the photographic style Deakins employed, and how it plays into the movie’s themes.

Blade Runner 2049 returns to the dystopian metropolis of Ridley Scott’s classic three decades later, introducing us to Ryan Gosling’s K. Like Harrison Ford’s Deckard before him, K is a titular Blade Runner, tasked with locating and “retiring” rogue replicants – artificial, bio-engineered people. He soon makes a discovery which could have huge implications both for himself and the already-strained relationship between humans and replicants. In his quest to uncover the truth, K must track down Deckard for some answers.

Villeneuve’s film meditates on deep questions of identity, creating a world in which you can never be sure who is or isn’t real – or even what truly constitutes being “real”. Deakins reinforces this existential uncertainty by reducing characters and locations to mere forms. Many scenes are shrouded in smog, mist, rain or snow, rendering humans and replicants alike as silhouettes.

K spends his first major scene seated in front of a window, the side-light bouncing off a nearby cabinet the only illumination on his face. Deakins’ greatest strength is his ability to adapt to whatever style each film requires, but if he has a recognisable signature it’s this courage to rely on a single source and let the rest of the frame go black.

Whereas Scott and his DP Jordan Cronenweth portrayed LA mainly at night, ablaze with pinpoints of light, Villeneuve and Deakins introduce it in daylight, but a daylight so dim and smog-ridden that it reveals even less than those night scenes from 1982.

All this is not to say that the film is frustratingly dark, or that audiences will struggle to make out what is going on. Shooting crisply on Arri Alexas with Arri/Zeiss Master Primes, Deakins is a master of ensuring that you see what you need to see.

A number of the film’s sequences are colour-coded, delineating them as separate worlds. The city is mainly fluorescent blues and greens, visually reinforcing the sickly state of society, with the police department – an attempt at justice in an insane world – a neutral white.

The Brutalist headquarters of Jared Leto’s blind entrepreneur Wallace are rendered in gold, as though the corporation attempted a friendly yellow but was corrupted by greed. These scenes also employ rippling reflections from pools of water. Whereas the watery light in the Tyrell HQ of Scott’s Blade Runner was a random last-minute idea by the director, concerned that his scene lacked enough interest and production value, here the light is clearly motivated by architectural water features. Yet it is used symbolically too, and very effectively so, as it underscores one of Blade Runner 2049’s most powerful scenes. At a point in the story where more than one character is calling their memories into question, the ripples playing across the walls are as intangible and illusory as those recollections. “I know what’s real,” Deckard asserts to Wallace, but both the photography and Ford’s performance belie his words.

The most striking use of colour is the sequence in which K first tracks Deckard down, hiding out in a Las Vegas that’s been abandoned since the detonation of a dirty bomb. Inspired by photos of the Australian dust storm of 2009, Deakins bathed this lengthy sequence in soft, orangey-red – almost Martian – light. This permeating warmth, contrasting with the cold artificial light of LA, underlines the personal nature of K’s journey and the theme of birth which is threaded throughout the film.

Deakins has stated in interviews that he made no attempt to emulate Cronenweth’s style of lighting, but nonetheless this sequel feels well-matched to the original in many respects. This has a lot to do with the traditional camerawork, with most scenes covered in beautifully composed static shots, and movement accomplished where necessary with track and dolly.

The visual effects, which bagged the film’s second Oscar, also drew on techniques of the past; the above featurette shows a Canon 1DC tracking through a miniature landscape at 2:29. “Denis and I wanted to do as much as possible in-camera,” Deakins told Variety, “and we insisted when we had the actors, at least, all the foreground and mid-ground would be in-camera.” Giant LED screens were used to get authentic interactive lighting from the advertising holograms on the city streets.

One way in which the lighting of the two Blade Runner movies is undeniably similar is the use of moving light sources to suggest an exciting world continuing off camera. (The infamous lens flares of J.J. Abrams’ Star Trek served the same purpose, illustrating Blade Runner’s powerful influence on the science fiction genre.) But whereas, in the original film, the roving searchlights pierce the locations sporadically and intrusively, the dynamic lights of Blade Runner 2049 continually remodel the actors’ faces. One moment a character is in mysterious backlight, the next in sinister side-light, and the next in revealing front-light – inviting the audience to reassess who these characters are at every turn.

This obfuscation and transience of identity and motivation permeates the whole film, and is its core visual theme. The 1982 Blade Runner was a deliberate melding of sci-fi and film noir, but to me the sequel does not feel like noir at all. Here there is little hard illumination, no binary division of light and dark. Instead there is insidious soft light, caressing the edge of a face here, throwing a silhouette there, painting everyone on a continuous (and continuously shifting) spectrum between reality and artificiality.

Blade Runner 2049 is a much deeper and more subtle film than its predecessor, and Deakins’ cinematography beautifully reflects this.


Grading “Above the Clouds”

Recently work began on colour grading Above the Clouds, a comedy road movie I shot for director Leon Chambers. I’ve covered every day of shooting here on my blog, but the story wouldn’t be complete without an account of this crucial stage of postproduction.

I must confess I didn’t give much thought to the grade during the shoot, monitoring in Rec.709 and not envisaging any particular “look”. So when Leon asked if I had any thoughts or references to pass on to colourist Duncan Russell, I had to put my thinking cap on. I came up with a few different ideas and met with Leon to discuss them. The one that clicked with his own thoughts was a super-saturated vintage postcard (above). He also liked how, in a frame grab I’d been playing about with, I had warmed up the yellow of the car – an important character in the movie!

Leon was keen to position Above the Clouds’ visual tone somewhere between the grim reality of a typical British drama and the high-key gloss of Hollywood comedies. Finding exactly the right spot on that wide spectrum was the challenge!

“Real but beautiful” was Duncan’s mantra when Leon and I sat down with him last week for a session in Freefolk’s Baselight One suite. He pointed to the John Lewis “Tiny Dancer” ad as a good touchstone for this approach.

We spent the day looking at the film’s key sequences. There was a shot of Charlie, Oz and the Yellow Peril (the car) outside the garage from week one which Duncan used to establish a look for the three characters. It’s commonplace nowadays to track faces and apply individual grades to them, making it possible to fine-tune skin-tones with digital precision. I’m pleased that Duncan embraced the existing contrast between Charlie’s pale, freckled innocence and Oz’s dirty, craggy world-weariness.

Above the Clouds was mainly shot on an Alexa Mini, in Log C ProRes 4444, so there was plenty of detail captured beyond the Rec.709 image that I was (mostly) monitoring. A simple example of this coming in useful is the torchlight charity shop scene, shot at the end of week two. At one point Leo reaches for something on a shelf and his arm moves right in front of his torch. Power-windowing Leo’s arm, Duncan was able to bring back the highlight detail, because it had all been captured in the Log C.

But just because all the detail is there, it doesn’t mean you can always use it. Take the gallery scenes, also shot in week two, at the Turner Contemporary in Margate. The location has large sea-view windows and white walls. Many of the key shots featured Oz and Charlie with their backs towards the windows. This is a classic contrasty situation, but I knew from checking the false colours in log mode that all the detail was being captured.

Duncan initially tried to retain all the exterior detail in the grade, by separating the highlights from the mid-tones and treating them differently. He succeeded, but it didn’t look real. It looked like Oz and Charlie were green-screened over a separate background. Our subconscious minds know that a daylight exterior cannot be only slightly brighter than an interior, so it appeared artificial. It was necessary to back off on the sky detail to keep it feeling real. (Had we been grading in HDR [High Dynamic Range], which may one day be the norm, we could theoretically have retained all the detail while still keeping it realistic. However, if what I’ve heard of HDR is correct, it may have been unpleasant for audiences to look at Charlie and Oz against the bright light of the window beyond.)

There were other technical challenges to deal with in the film as well. One was the infra-red problem we encountered with our ND filters during last autumn’s pick-ups, which meant that Duncan had to key out Oz’s apparently pink jacket and restore it to blue. Another was the mix of formats employed for the various pick-ups: in addition to the Alexa Mini, there was footage from an Arri Amira, a Blackmagic Micro Cinema Camera (BMMCC) and even a Canon 5D Mk III. Although the latter had an intentionally different look, the other three had to match as closely as possible.

A twilight scene set in a rural village contains perhaps the most disparate elements. Many shots were done day-for-dusk on the Alexa Mini in Scotland, at the end of week four. Additional angles were captured on the BMMCC in Kent a few months later, both day-for-dusk and dusk-for-dusk. This outdoor material continues directly into indoor scenes, shot on a set this February on the Amira. Having said all that, they didn’t match too badly at all, but some juggling was required to find a level of darkness that worked for the whole sequence while retaining consistency.

In other sequences, like the ones in Margate near the start of the film, a big continuity issue is the clouds. Given the film’s title, I always tried to frame in plenty of sky and retain detail in it, using graduated ND filters where necessary. Duncan was able to bring out, suppress or manipulate detail as needed, to maintain continuity with adjacent shots.

Consistency is important in a big-picture sense too. One of the last scenes we looked at was the interior of Leo’s house, from weeks two and three, for which Duncan hit upon a nice, painterly grade with a bit of mystery to it. The question is, does that jar with the rest of the movie, which is fairly light overall, and does it give the audience the right clues about the tone of the scene which will unfold? We may not know the answers until we watch the whole film through.

Duncan has plenty more work to do on Above the Clouds, but I’m confident it’s in very good hands. I will probably attend another session when it’s close to completion, so watch this space for that.

See all my Above the Clouds posts here, or visit the official website.


Lighting I Like: “Preacher”

Preacher is the subject of this week’s episode of Lighting I Like. I discuss two scenes, from the second episode of the second season, “Mumbai Sky Tower”, which demonstrate the over-the-top, comic-book style of the show.

Both seasons of Preacher can be seen on Amazon Video in the UK.

New episodes of Lighting I Like are released at 8pm BST every Wednesday. Next week I’ll look at two scenes from Broadchurch. Click here to see the playlist of all Lighting I Like episodes.


12 Tips for Better Instagram Photos

I joined Instagram last summer, after hearing DP Ed Moore say in an interview that his Instagram feed helps him get work. I can’t say that’s happened for me yet, but an attractive Instagram feed can’t do any creative freelancer any harm. And for photographers and cinematographers, it’s a great way to practise our skills.

The tips below are primarily aimed at people who are using a phone camera to take their pictures, but many of them will apply to all types of photography.

The particular challenge with Instagram images is that they’re usually viewed on a phone screen; they’re small, so they have to be easy for the brain to decipher. That means reducing clutter, keeping things bold and simple.

Here are twelve tips for putting this philosophy into practice. The examples are all taken from my own feed, and were taken with an iPhone 5, almost always using the HDR (High Dynamic Range) mode to get the best tonal range.

 

1. Choose your background carefully

The biggest challenge I find in taking snaps with my phone is the huge depth of field. This makes it critical to have a suitable, non-distracting background, because it can’t be thrown out of focus. In the pub photo below, I chose to shoot against the blank pillar rather than against the racks of drinks behind the bar, so that the beer and lens mug would stand out clearly. For the Lego photo, I moved the model away from a messy table covered in multi-coloured blocks to use a red-only tray as a background instead.

 

2. Find frames within frames

The Instagram filters all have a frame option which can be activated to give your image a white border, or a fake 35mm negative surround, and so on. An improvement on this is to compose your image so that it has a built-in frame. (I discussed frames within frames in a number of my recent posts on composition.)

 

3. Try symmetrical composition

To my eye, the square aspect ratio of Instagram is not wide enough for The Rule of Thirds to be useful in most cases. Instead, I find the most arresting compositions are central, symmetrical ones.

 

4. Consider shooting flat on

In cinematography, an impression of depth is usually desirable, but in a little Instagram image I find that two-dimensionality can sometimes work better. Such photos take on a graphical quality, like icons, which I find really interesting. The key thing is that 2D pictures are easier for your brain to interpret when they’re small, or when they’re flashing past as you scroll.

 

5. Look for shapes

Finding common shapes in a structure or natural environment can be a good way to make your photo catch the eye. In these examples I spotted an ‘S’ shape in the clouds and footpath, and an ‘A’ shape in the architecture.

 

6. Look for textures

Textures can add interest to your image. Remember the golden rule of avoiding clutter though. Often textures will look best if they’re very bold, like the branches of the tree against the misty sky here, or if they’re very close-up, like this cathedral door.

 

7. Shoot into the light

Most of you will not be lighting your Instagram pics artificially, so you need to be aware of the existing light falling on your subject. Often the strongest look is achieved by shooting towards the light. In certain situations this can create interesting silhouettes, but often there are enough reflective surfaces around to fill in the shadows so you can get the beauty of the backlight and still see the detail in your subject. You definitely need to be in HDR mode for this.

 

8. Look for interesting light

It’s also worth looking out for interesting light which may make a dull subject into something worth capturing. Nature provides interesting light every day at sunrise and sunset, so these are good times to keep an eye out for photo ops.

 

9. Use lens flare for interest

Photographers have been using lens flare to add an extra something to their pictures for decades, and certain science fiction movies have also been known to use (ahem) one or two. To avoid a flare being too overpowering, position your camera so as to hide part of the sun behind a foreground object. To get that anamorphic cinema look, wipe your finger vertically across your camera lens. The natural oils on your skin will cause a flare at 90° to the direction you wiped in. (Best not try this with that rented set of Master Primes though.)

 

10. Control your palette

Nothing gives an image a sense of unity and professionalism as quickly as a controlled colour palette. You can do this in-camera, like I did below by choosing the purple cushion to photograph the book on, or by adjusting the saturation and colour cast in the Photos app, as I did with the Canary Wharf image. For another example, see the Lego shot under point 3.

 

11. Wait for the right moment

Any good photographer knows that patience is a virtue. Waiting for pedestrians or vehicles to reach just the right spot in your composition before tapping the shutter can make the difference between a bold, eye-catching photo and a cluttered mess. In the below examples, I waited until the pedestrians (left) and the rowing boat and swans (right) were best placed against the background for contrast and composition before taking the shot.

 

12. Quality control

One final thing to consider: is the photo you’ve just taken worthy of your Instagram profile, or is it going to drag down the quality of your feed? If it’s not good, maybe you should keep it to yourself.

Check out my Instagram feed to see if you think I’ve broken this rule!
