“The Little Mermaid”: Circus Cinematography

‘B’ cam 2nd AC Matt Dixon preps the camera on the crane’s Scorpio remote head. Photo: Tim Gill

The biggest set on The Little Mermaid was the circus, an area the size of a football pitch which was transformed into a period spectacle. The big top and many other pieces were driven across the country from LA, and during our first week of principal photography the art department were hard at work setting it all up and dressing it.

Today’s post is about how I lit the night exterior scenes on this huge set. To that end I’m going to focus on the two biggest shots in the sequence: a tracking shot outside the big top, and the crane-up which first reveals the circus to the audience. Below, as well as my diary entries from the shoot, you’ll find a little behind-the-scenes video I grabbed on my phone for the tracking shot, and a lighting plan for the crane shot.

 

Day 7

The two 18K HMI fresnels rigged on the condor

A couple of daylight pick-ups today, then we start setting up for a big night scene. The camera will dolly with Cam and Elle from the exit of the big top, past candy floss and ‘healing water’ stalls where bits of dialogue will happen, and finally reveal a ferris wheel in the distance. We block while the sun’s still up, and paint in the light as night sets in. We are trying to light most of it in a way that will also work for our big crane shot reveal of the circus later in the week, because repositioning large HMIs – especially the two 18Ks we’re flying on a condor (cherry-picker) – is very time-consuming. Inevitably it doesn’t quite work out that way, and one of the 18Ks has to die for now at least.

The ferris wheel is backlit by a 12K, with a little front-light from a 5K tungsten fresnel. Cross-light on the talent comes from the working 18K and a 6K on the opposite side of frame. Nine-light Maxi Brutes illuminate the tent from inside, some of that light spilling out onto the talent, while par cans uplight a row of banners outside. A 1K baby provides edge-light to the talent in their final position. A 300W fresnel inside the healing water wagon spills out, and a bare 40W globe inside the umbrella of the candy floss stall gives us a little glow there.

The final lamp to go up is another 300W fresnel, because the directors are concerned that the ‘sold out’ sign on the healing water wagon isn’t clear enough. We end up firing it in from the front because there’s no time for anything else, but as always with front-light, I deeply regret it. Ideally we would have armed it out from the roof of the wagon to rake down the side of it.

Once the supporting artists are choreographed, the shot looks great.

[Instagram embed: behind-the-scenes video of the tracking shot, posted by Neil Oseman (@neiloseman)]

 

Day 9

A Nine-light Maxi Brute inside the big top

We start with a big crane shot revealing the whole circus at night. For this shot we have the following lamps burning: 2 x 18Ks (on a condor), 1 x 12K, 1 x 4K – all those are providing moonlight or starlight, with varying degrees of blue gel on them; 2 x Nine-light Maxi Brutes making the big top glow from inside; a 5K spilling some orange glow on the background; 2 or 3 smaller tungsten units spilling out light from inside the smaller tents; and lighting the foreground, a 4×4 Kino and a 1K baby bounced off unbleached muslin. There are also numerous practicals on, including the lights on the ferris wheel, the illuminated ‘circus’ sign, several par can up-lighters, and about 7KW of fairy lights. Totalling over 80KW, it is easily the biggest lighting set-up of my career. Although the grip and electrical crew is relatively small given the scale of the set-up, they handle it with aplomb.

We have rebuilt our Giraffe crane to its maximum 31ft configuration, so we can swoop up over the entrance tent, past the ‘circus’ sign, and reveal the twinkling string-lights of the midway leading to the big top, and the rides beyond.

Here’s a retrospective lighting plan for this crane shot (not to scale); click to enlarge it. Note that additional tents were added in postproduction, as you can see in the trailer.

Ideally we would have had two condors, with an 18K on each, and put one of them way back behind the trees, to maintain a consistent direction of moonlight, but budget and the practicalities of the location made this impossible.

The ‘A’ camera on the dolly, with the two 18Ks on the condor in the background

One thing that was a little different to my original plan was the hard 4K edging the roofs of the midway tents on the lefthand side. This was meant to be a pair of 6Ks firing through a diffusion frame, to get a much softer, less “sourcey” look than the hard “moonlight” from the 18Ks. But unfortunately both our 6Ks were malfunctioning.

Another change was the lighting of the midway itself. We had a tungsten helium balloon on the truck, which I had planned to float above the midway to provide warm ambience. As it turned out, the practical string lights, although only 40W each, were so numerous that they provided ample illumination in the centre of the frame.

Later on in production, I was chatting to one of the ADs about this scene and he expressed surprise at how well I had handled it, given that it was so much bigger than any lighting set-up I’d previously done. Honestly it never fazed me. Lighting is entirely scaleable; the principles are identical, whether your set is a small bathroom or a football pitch. I’d done so much night exterior in my career, I’d just never had the big toys I wanted before. I’ll let you in on a secret though: the only reason I knew to ask for 18Ks and a condor was from reading American Cinematographer!

In my next post I’ll discuss shooting some of Poppy Drayton’s key scenes as the eponymous mermaid, including her introduction inside the big top. Don’t forget that The Little Mermaid is currently showing in movie theatres across the US and on Amazon in the UK.


“The Little Mermaid”: Shooting Shirley

The Little Mermaid, an independent live-action take on the Hans Christian Andersen fairytale, is now showing in cinemas across the USA. To mark the release, over the next few weeks I’ll be posting a series of articles about my cinematography of the film, using extracts from the diary I kept during production.

In this first instalment I’ll focus on the “pre-shoot”, two days of capturing the present-day scenes, undertaken a few weeks before principal photography began. For these scenes, we were all very excited to be working with bona fide Hollywood royalty in the form of Shirley MacLaine. Since her debut in the 1955 Hitchcock comedy The Trouble with Harry (which won her a Golden Globe), Shirley’s career has taken in six Oscar nominations as well as a win for Terms of Endearment, plus an AFI Life Achievement Award, two Baftas, an Emmy and several more Golden Globes.

No pressure then….

 

Saturday

Shirley is installed at a five-star hotel in downtown Savannah for hair, make-up and wardrobe tests. Taking it easy at the studio, I get a call from the UPM telling me that Shirley wants to meet me. Nervously I transfer my lighting reference images (including screen grabs I gathered last week from her previous movies) to my iPad and await my car.

When I get to the hotel I bump into her and the rest of the crew in the hall. Plunging straight in, I shake her hand and introduce myself as “Neil Oseman, the DP”. Evidently not hearing that last bit, and presuming I’m a PA or possibly a fan, she looks me up and down and asks me who I am. I repeat that I am the director of photography. “You’re so young!” she exclaims, laughing at her mistake.

“Well, I’ve been doing this for fifteen years,” I reply, all too aware of how short my career is compared with hers.

“Which pictures? Tell me,” she says.

Again acutely aware that my credits list isn’t going to sound very impressive to her, I mention Heretiks and Ren: The Girl with the Mark, and mutter something about doing lots of features.

To my great relief she doesn’t press the point, instead asking what I think of the wig and make-up she’s wearing. I ask her to step into the daylight, and assure her that it looks good, but that I’d like to warm up her skin tone a little with the lighting, an idea she responds well to.

Satisfied, Shirley moves on to other things, and I hang out in a meeting room at the hotel drawing storyboards, until it’s time for a production meeting.

 

Sunday

The present-day scenes were shot on Arri Alexas using Zeiss Super Speed Mark I primes and an Angenieux Optimo zoom, diffused with Tiffen Soft FX filters.

I arrive on location before even the early crew call of 8am, with my gaffer Mike Horton. His and key grip Jason Batey’s teams have rigged a dark box around the beach house’s deck/balcony so we can shoot day-for-night interiors.

At 10am Shirley arrives, blocks the scene, then goes off to hair and makeup. We’re starting with close-ups of her, so the grip and electric teams come in and build a book light. (This is a V-shaped arrangement of bounce and diffusion material, resembling an open book, which greatly softens the light fired into it.) When we start to turn over and Shirley watches playback, I’m gratified to find she is very happy with how she looks on camera. We shoot out all her close-ups, then bring in the little girls playing opposite her and block the wide shots.

In the lefthand foreground here is the 2K source for the book light. In the top right you can see the diffusion frame it’s firing through, and you can just make out the poly or rag we attached to the wall to bounce the light back onto Shirley (in the white nightgown). The net in the upper centre is cutting some light off the background. The camera can just be seen on the right of the photo.

As time begins to crunch, I fall back on cross-backlighting as a quick no-brainer solution to get the wide shot looking good. It’s so important to have these lighting templates up your sleeve when the pressure’s on. (Later on in this blog series I’ll discuss the use of cross-backlighting in several other scenes in the movie.)

For a little while it looks like we might not make the day, but I suggest a way to maximise the beautiful beach view at twilight and get the story beats covered in one two-camera set-up. The shot feels like something out of a classic old movie. Shirley MacLaine walking off into the sunset! Everyone loves how it looks, including Shirley. The praise of an actor as experienced as her is high praise indeed, and it makes my day!

 

Monday

At the monitors with producer Rob Molloy. Photo: Brooks Patrick Allen

We start lighting for our “sunset” scene, which involves firing a pink-gelled 6K through the window and netting the background to get some highlight detail into it. Rather than a book light, this time I use a diffused 4×4 Kino Flo as Shirley’s key. I take a risk and place it further off to the side to get a bit more shape into the light.

Shirley enters, takes one glance at the lighting and remarks, “So, you like this cross-light, huh?”

Busted!

We compromise by adding a little fill from a reflector which Shirley positions herself before each take. Her awareness of how she’s being photographed is astounding. She knows more about lighting than some DPs I’ve met!

Looking at the scenes now, I realise that a large white horizontal reflector in front of Shirley would have been perfect to simulate bounce off the bed, which we moved out when we were shooting the close-ups. Hindsight is 20/20, but I’m still pleased with how it turned out.

Next week I’ll break down the huge lighting set-up required for the night exterior circus scenes.


6 Tips for Making DIY Lighting Look Pro

Good lighting can boost the production values of a film tremendously, making the difference between an amateur and a professional-looking piece. For filmmakers early in their careers, however, the equipment typically used to achieve these results can be prohibitively expensive. Far from the Hollywood productions attended by trucks full of lights, a micro-budget film may be unable to rent even a single HMI. Do not despair though, as there are ways to light scenes well without breaking the bank. Here are my top six tips for lighting on the cheap.

 

1. Make the most of natural light

Checking my compass at the stone circle
Guesstimating the sun path on location

The hardest shots to light without the proper equipment are wide shots. Where a fully-budgeted production would rig Maxi Brutes on cherry-pickers, or pound HMIs through windows, a filmmaker of limited means simply won’t have access to the raw power of such fixtures. Instead, plan your day carefully to capture the wide shots at the time when natural light gives you the most assistance. For a day interior, this means shooting when the sun is on the correct side of the building.

See also: “Sun Paths”

 

2. Keep L.E.D.s to the background

£2 LED camping light

There is a plethora of LED fixtures on the market, designed for all kinds of applications, some of them very reasonably priced. It might be tempting to purchase some of these to provide your primary illumination, but I advise against it. Cheap LED units (and fluorescents) have a terrible Colour Rendering Index (CRI), making for unnatural and unappealing skin tones. Such units are therefore best restricted to backgrounds, accent lighting and “specials”. For example, I purchased a little LED camping light from a charity shop for about £2, and I often use it to create the blue glow from computer screens or hang it from the ceiling to produce a hint of hair-light.

See also my article on LEDs from my “Know Your Lights” series.

 

3. Key with tungsten or halogen

Worklight
Halogen floodlight

By far the best solution for a high output, high CRI, low cost key is a halogen floodlight; 500W models are available for as little as £5. Their chief disadvantage is the lack of barn doors, making the light hard to control, though if you can stretch to a roll of black wrap you can fashion a kind of snoot. Alternatively, consider investing in a secondhand tungsten movie fixture. With many people switching to LEDs, there are plenty of old tungsten units out there. Try to get a reputable brand like Arri or Ianiro, as some of the unbranded units available on Ebay are poorly wired and can be unsafe.

See also: “DIY Interview Lighting for the ‘Ren’ EPK”

 

4. Control the light

Lace curtains used to break up light in a Camerimage workshop last year

Flooding a halogen light onto a scene is never going to look good, but the same is often true of dedicated movie fixtures. It’s how you modify the light that creates a nuanced, professional look. Improvise flags from pieces of cardboard to stop the light spilling into unwanted places – but be VERY careful how close you put them to a tungsten or halogen source, as these get extremely hot. For example, when shooting indoors, flag light off the background wall (especially if it’s white or cream) to help your subject stand out.

See also “Lighting Micro-sets” for an example of this.

 

5. Soften the light

Almost all cinematographers today prefer the subtlety of soft light to the harshness of hard light. You can achieve this by bouncing your fixture off a wall or ceiling, or a sheet of polystyrene or card. Or you could hang a white bedsheet or a shower curtain in front of the light as diffusion, but again be sure to leave a safe distance between them. Professional collapsible reflectors are available very cheaply online, and can be used in multiple ways to diffuse or reflect light.

Hot tub cover = bounce board. Towel = flag

See also: “How to Soften Harsh Sunlight with Tinfoil and a Bedsheet”; and to read more about the pictured example: “Always Know Where Your Towel Is”

 

6. Make use of practicals

Black-wrapped ceiling light

Finally, don’t be afraid to use existing practical lighting in your scene. Turning on the main overhead light usually kills the mood, but sometimes it can be useful. You can generate more contrast and shape by covering up the top of the lampshade, thus preventing ceiling bounce, or conversely use the ceiling bounce to give some ambient top-light and cover the bottom of the lampshade to prevent a harsh hotspot underneath it. Table lamps and under-cupboard kitchen lights can add a lot of interest and production value to your backgrounds. If possible, swap out LED or fluorescent bulbs for conventional tungsten ones for a more attractive colour and to eliminate potential flickering on camera.

See also: “5 Tips for Working with Practicals”, and for an example of the above techniques, my blog from day two of the Forever Alone shoot.


What Does “Cinematic” Mean?

Earlier this year I undertook a personal photography project called Stasis. I deliberately set out to do something different to my cinematography work, shooting in portrait, taking the paintings of Dutch seventeenth century masters as my inspiration, and eschewing traditional lighting fixtures in favour of practical sources. I was therefore a little disappointed when I began showing the images to people and they described them as “cinematic”.

An image from “Stasis”

This experience made me wonder just what people mean by that word, “cinematic”. It’s a term I’ve heard – and used myself – many times during my career. We all seem to have some vague idea of what it means, but few of us are able to define it. 

Dictionaries are not much help either, with the Oxford English Dictionary defining it simply as “relating to the cinema” or “having qualities characteristic of films”. But what exactly are those qualities?

Shallow depth of field is certainly a quality that has been widely described as cinematic. Until the late noughties, shallow focus was the preserve of “proper” movies. The size of a 35mm frame (or of the digital cinema sensors which were then emerging) meant that backgrounds could be thrown way out of focus while the subject remained crisp and sharp. The formats which lower-budget productions had until then been shot on – 2/3” CCDs and Super-16 film – could not achieve such an effect. 

Then the DSLR revolution happened, putting sensors as big as – or bigger than – those of Hollywood movies into the hands of anyone with a few hundred pounds to spare. Suddenly everyone could get that “cinematic” depth of field. 

My first time utilising the shallow depth of field of a DSLR, on a never-completed feature back in 2011.

Before long, of course, ultra-shallow depth of field became more indicative of a low-budget production trying desperately to look bigger than of something truly cinematic. Gradually young cinematographers started to realise that their idols chose depth of field for storytelling reasons, rather than simply using it because they could. Douglas Slocombe, OBE, BSC, ASC, cinematographer of the original Indiana Jones trilogy, was renowned for his deep depth of field, typically shooting at around T5.6, while Janusz Kaminski, ASC, when shooting Kingdom of the Crystal Skull, stopped down as far as T11.

There was also a time when progressive scan – the recording of discrete frames rather than alternately odd and even horizontal lines to make an interlaced image – was considered cinematic. Now it is standard in most types of production, although deviations from the norm of 24 or 25 frames per second, such as the high frame rate of The Hobbit, still make audiences think of reality TV or news, rejecting it as “uncinematic”.

Other distinctions in shooting style between TV/low-budget film and big-budget film have slipped away too. The grip equipment that enables “cinematic” camera movement – cranes, Steadicams and other stabilisers – is accessible now in some form to most productions. Meanwhile the multi-camera shooting which was once the preserve of TV, looked down upon by filmmakers, has spread into movie production.

A direct comparison may help us drill to the core of what is “cinematic”. Star Trek: Generations, the seventh instalment in the sci-fi film franchise, went into production in spring 1994, immediately after the final TV season of Star Trek: The Next Generation wrapped. The movie shot on the same sets, with the same cast and even the same acquisition format (35mm film) as the TV series. It was directed by David Carson, who had helmed several episodes of the TV series, and whose CV contained no features at that point.

Yet despite all these constants, Star Trek: Generations is more cinematic than the TV series which spawned it. The difference lies with the cinematographer, John A. Alonzo, ASC, one of the few major crew members who had not worked on the TV show, and whose experience was predominantly in features. I suspect he was hired specifically to ensure that Generations looked like a movie, not like TV.

The main thing that stands out to me when comparing the film and the series is the level of contrast in the images. The movie is clearly darker and moodier than the TV show. In fact I can remember my schoolfriend Chris remarking on this at the time – something along the lines of, “Now it’s a movie, they’re in space but they can only afford one 40W bulb to light the ship.” 

The bridge of the Enterprise D as seen on TV (top) and in the “Generations” movie (bottom).

It was a distinction borne of technical limitations. Cathode ray tube TVs could only handle a dynamic range of a few stops, requiring lighting with low contrast ratios, while a projected 35mm print could reproduce much more subtlety. 

Today, film and TV is shot on the same equipment, and both are viewed on a range of devices which are all good at dealing with contrast (at least compared with CRTs). The result is that, with contrast as with depth of field, camera movement and progressive scan, the distinction between the cinematic and the uncinematic has reduced. 

The cinematography of “Better Call Saul” owes much to film noir.

In fact, I’d argue that it’s flipped around. To my eye, many of today’s TV series – and admittedly I’m thinking of high-end ones like The Crown, Better Call Saul or The Man in the High Castle, not Eastenders – look more cinematic than modern movies. 

As my friend Chris had realised, the flat, high-key look of Star Trek: The Next Generation was actually far more realistic than that of its cinema counterpart. And now movies seem to have moved towards realism in the lighting, which is less showy and not so much moody for the sake of being moody, while TV has become more daring and stylised.

A typically moody and contrasty shot from “The Crown”

The Crown, for example, blasts a 50KW Soft Sun through the window in almost every scene, bathing the monarchy in divine light to match its supposed divine right, while Better Call Saul paints huge swathes of rich, impenetrable black across the screen to represent the rotten soul of its antihero. 

Film lighting today seems to strive for naturalism for the most part. Top DPs like recent Oscar-winner Roger Deakins, CBE, ASC, BSC, talk about relying heavily on practicals and using fewer movie fixtures, and fellow nominee Rachel Morrison, ASC, despite using a lot of movie fixtures, goes to great lengths to make the result look unlit. Could it be that film DPs feel they can be more subtle in the controlled darkness of a cinema, while TV DPs choose extremes to make their vision clear no matter what device it’s viewed on or how much ambient light contaminates it?

“Mudbound”, shot by Rachel Morrison, ASC

Whatever the reason, contrast does seem to be the key to a cinematic look. Even though that look may no longer be exclusive to movies released in cinemas, the perception of high contrast being linked to production value persists. The high contrast of the practically-lit scenes in my Stasis project is – as best I can tell – what makes people describe it as cinematic.

What does all of this mean for a filmmaker? Simply pumping up the contrast in the grade is not the answer. Contrast should be built into the lighting, and used to reveal and enhance form and depth. The importance of good production design, or at least good locations, should not be overlooked; shooting in a friend’s white-walled flat will kill your contrast and your cinematic look stone dead. 

A shot of mine from “Forever Alone”, a short film where I was struggling to get a cinematic look out of the white-walled location.

Above all, remember that story – and telling that story in the most visually appropriate way – is the essence of cinema. In the end, that is what makes a film truly cinematic.



How Big a Light do I Need?

Experience goes a long way, but sometimes you need to be more precise about what size of lighting instruments are required for a particular scene. Night exteriors, for example; you don’t want to find out on the day that the HMI you hired as your “moon” backlight isn’t powerful enough to cover the whole of the car park you’re shooting in. How can you prep correctly so that you don’t get egg on your face?

There are two steps: 1. determine the intensity of light you require on the subject, and 2. find a combination of light fixture and fixture-to-subject distance that will provide that intensity.

 

The required intensity

The goal here is to arrive at a number of foot-candles (fc). Foot-candles are a unit of light intensity, sometimes more formally called illuminance, and one foot-candle is the illuminance produced by a standard candle one foot away. (Illuminance can also be measured in the SI unit of lux, where 1 fc ≈ 10 lux, but in cinematography foot-candles are more commonly used. It’s important to remember that illuminance is a measure of the light incident to a surface, i.e. the amount of light reaching the subject. It is not to be confused with luminance, which is the amount of light reflected from a surface, or with luminous power, a.k.a. luminous flux, which is the total amount of light emitted from a source.)

Usually you start with a T-stop (or f-stop) that you want to shoot at, based on the depth of field you’d like. You also need to know the ISO and shutter interval (usually 1/48th or 1/50th of a second) you’ll be shooting at. Next you need to convert these facets of exposure into an illuminance value, and there are a few different ways of doing this.

One method is to use a light meter, if you have one, entering the ISO and shutter values into it. Then you wave it around your office, living room or wherever, pressing the trigger until you happen upon a reading which matches your target f-stop. Then you simply switch your meter into foot-candles mode and read off the number. This method can be a bit of a pain in the neck, especially if – like mine – your meter requires fiddly flipping of dip-switches and additional calculations to get a foot-candles reading out of it.

A much simpler method is to consult an exposure table, like the one below, or an exposure calculator, which I’m sure is a thing which must exist, but I’ll be damned if I could find one.

Some cinematographers memorise the fact that 100fc is f/2.8 at ISO 100, and work out other values from that. For example, ISO 400 is four times (two stops) faster than ISO 100, so a quarter of the light is required, i.e. 25fc.

Alternatively, you can use the underlying maths of the above methods. This is unlikely to be necessary in the real world, but for the purposes of this blog it’s instructive to go through the process. The equation is:

b = 25f² ÷ (s × i)

where

  • b is the illuminance in fc,
  • f is the f– or T-stop,
  • s is the shutter interval in seconds, and
  • i is the ISO.

Say I’m shooting on an Alexa with a Cooke S4 Mini lens. If I have the lens wide open at T2.8, the camera at its native ISO of 800 and the shutter interval at the UK standard of 1/50th (0.02) of a second:

b = 25 × 2.8² ÷ (0.02 × 800) ≈ 12

… so I need about 12fc of light.
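If you’d rather not punch this into a calculator every time, the formula is trivial to wrap in a few lines of code. Here’s a minimal Python sketch (the function name is my own, purely for illustration):

```python
def required_footcandles(t_stop, shutter_seconds, iso):
    """Illuminance (fc) needed for correct exposure: b = 25 * f^2 / (s * i)."""
    return 25 * t_stop ** 2 / (shutter_seconds * iso)

# The Alexa/Cooke example above: T2.8, 1/50th of a second, ISO 800
print(required_footcandles(2.8, 0.02, 800))  # ~12.25 fc
```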

 

The right instrument

In the rare event that you’re actually lighting your set with candles – as covered in my Barry Lyndon and Stasis posts – then an illuminance value in fc is all you need. In every other situation, though, you need to figure out which electric light fixtures are going to give you the illuminance you need.

Manufacturers of professional lighting instruments make this quite easy for you, as they all provide data on the illuminance supplied by their products at various distances. For example, if I visit Mole Richardson’s webpage for their 1K Baby-Baby fresnel, I can click on the Performance Data table to see that this fixture will give me the 12fc (in fact slightly more, 15fc) that I required in my Alexa/Cooke example at a distance of 30ft on full flood.

Other manufacturers provide interactive calculators: on ETC’s site you can drag a virtual Source Four back and forth and watch the illuminance read-out change, while Arri offers a free iOS/Android app with similar functionality.

If you need to calculate an illuminance value for a distance not specified by the manufacturer, you can derive it from distances they do specify, by using the Inverse Square Law. However, as I found in my investigatory post about the law, that could be a whole can of worms.
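If you only need a rough estimate for a distance the manufacturer doesn’t list, the scaling itself is simple enough to sketch in code. This assumes the fixture behaves as a point source, which, as that post discusses, is not always true:

```python
def illuminance_at(known_fc, known_distance, new_distance):
    """Estimate illuminance at a new distance from a quoted figure,
    assuming Inverse Square Law fall-off (1/d^2)."""
    return known_fc * (known_distance / new_distance) ** 2

# e.g. the 1K quoted at 15 fc at 30 ft, estimated at 40 ft:
print(illuminance_at(15, 30, 40))  # ~8.4 fc
```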

If illuminance data is not available for your light source, then I’m afraid more maths is involved. For example, the room I’m currently in is lit by a bulb that came in a box marked “1,650 lumens”, which is the luminous power. One lumen is one foot-candle per square foot. To find out the illuminance, i.e. how many square feet those lumens are spread over, we imagine those square feet as the area of a sphere with the lamp at the centre, and where the radius r is the distance from the lamp to the subject. So:

b = Φ ÷ 4πr²

where

  • b is again the illuminance in fc,
  • Φ is the luminous power of the source in lumens, and
  • r is the lamp-to-subject distance in feet.

(I apologise for the mix of Imperial and SI units, but this is the reality in the semi-Americanised world of British film production! Also, please note that this equation is for point sources, rather than beams of light like you get from most professional fixtures. See this article on LED Watcher if you really want to get into the detail of that.)

So if I want to shoot that 12fc scene on my Alexa and Cooke S4 Mini under my 1,650 lumen domestic bulb:

r = √(Φ ÷ 4πb) = √(1,650 ÷ (4π × 12)) ≈ 3.3

… my subject needs to be 3’4″ from the lamp. I whipped out my light meter to check this, and it gave me the target T2.8 at 3’1″ – pretty close!
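For what it’s worth, here is the same point-source sum in Python form, solving in both directions; the function names are just illustrative:

```python
import math

def fc_from_lumens(lumens, distance_ft):
    """Illuminance (fc) from a point source: b = lumens / (4 * pi * r^2)."""
    return lumens / (4 * math.pi * distance_ft ** 2)

def distance_for_fc(lumens, target_fc):
    """Distance (ft) at which a point source delivers the target illuminance."""
    return math.sqrt(lumens / (4 * math.pi * target_fc))

# The 1,650-lumen domestic bulb and the ~12 fc target from above:
print(distance_for_fc(1650, 12.25))  # ~3.3 ft
print(fc_from_lumens(1650, 3.3))     # ~12 fc, as a sanity check
```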

 

Do I have enough light?

If you’re on a tight budget, it may be less a case of, “What T-stop would I like to shoot at, and what fixture does that require?” and more a case of, “Is the fixture which I can afford bright enough?”

Let’s take a real example from Perplexed Music, a short film I lensed last year. We were shooting on an Alexa at ISO 1600, 1/50th sec shutter, and on Arri/Zeiss Ultra Primes, which have a maximum aperture of T1.9. The largest fixture we had was a 2.5K HMI, and I wanted to be sure that we would have enough light for a couple of night exteriors at a house location.

In reality I turned to an exposure table to find the necessary illuminance, but let’s do the maths using the first equation that we met in this post:

b = 25 × 1.9² ÷ (0.02 × 1,600) ≈ 2.8

Loading up Arri’s photometrics app, I could see that 2.8fc wasn’t going to be a problem at all, with the 2.5K providing 5fc at the app’s maximum distance of 164ft.

That’s enough for today. All that maths may seem bewildering, but most of it is eliminated by apps and other online calculators in most scenarios, and it’s definitely worth going to the trouble of checking you have enough light before you’re on set with everyone ready to roll!

See also: 6 Ways of Judging Exposure



Colour Rendering Index

Many light sources we come across today have a CRI rating. Most of us realise that the higher the number, the better the quality of light, but is it really that simple? What exactly is Colour Rendering Index, how is it measured and can we trust it as cinematographers? Let’s find out.

 

What is C.R.I.?

CRI was created in 1965 by the CIE – Commission Internationale de l’Eclairage – the same body responsible for the colour-space diagram we met in my post about How Colour Works. The CIE wanted to define a standard method of measuring and rating the colour-rendering properties of light sources, particularly those which don’t emit a full spectrum of light, like fluorescent tubes which were becoming popular in the sixties. The aim was to meet the needs of architects deciding what kind of lighting to install in factories, supermarkets and the like, with little or no thought given to cinematography.

As we saw in How Colour Works, colour is caused by the absorption of certain wavelengths of light by a surface, and the reflection of others. For this to work properly, the light shining on the surface in the first place needs to consist of all the visible wavelengths. The graphs below show that daylight indeed consists of a full spectrum, as does incandescent lighting (e.g. tungsten), although its skew to the red end means that white-balancing is necessary to restore the correct proportions of colours to a photographed image. (See my article on Understanding Colour Temperature.)

Fluorescent and LED sources, however, have huge peaks and troughs in their spectral output, with some wavelengths missing completely. If the wavelengths aren’t there to begin with, they can’t reflect off the subject, so the colour of the subject will look wrong.

Analysing the spectrum of a light source to produce graphs like this required expensive equipment, so the CIE devised a simpler method of determining CRI, based on how the source reflected off a set of eight colour patches. These patches were murky pastel shades taken from the Munsell colour wheel (see my Colour Schemes post for more on colour wheels). In 2004, six more-saturated patches were added.

The maths which is used to arrive at a CRI value goes right over my head, but the testing process boils down to this:

  1. Illuminate a patch with daylight (if the source being tested has a correlated colour temperature of 5,000K or above) or incandescent light (if below 5,000K).
  2. Compare the colour of the patch to a colour-space CIE diagram and note the coordinates of the corresponding colour on the diagram.
  3. Now illuminate the patch with the source being tested.
  4. Compare the new colour of the patch to the CIE diagram and note the coordinates of the corresponding colour.
  5. Calculate the distance between the two coordinates, i.e. the difference in colour under the two light sources.
  6. Repeat with the remaining patches and calculate the average difference.
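For the curious, the arithmetic behind step 6 is simpler than the colour science that precedes it: the published CIE method converts each patch’s colour shift (ΔE) into a special index Ri = 100 − 4.6ΔEi and then takes a plain mean. Here’s a rough Python sketch of just that averaging step, with made-up ΔE values and none of the chromatic-adaptation maths the real calculation involves:

```python
def cri_general_index(delta_e_values):
    """General CRI (Ra): the mean of the special indices Ri = 100 - 4.6 * dE,
    where each dE is a patch's colour difference between the test source
    and the reference illuminant."""
    special_indices = [100 - 4.6 * de for de in delta_e_values]
    return sum(special_indices) / len(special_indices)

# Hypothetical colour shifts for the eight standard patches:
print(cri_general_index([2.1, 3.0, 1.5, 4.2, 2.8, 3.6, 2.2, 3.4]))  # ~87
```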

Here are a few CRI ratings gleaned from around the web:

Sodium streetlight: -44
Standard fluorescent: 50-75
Standard LED: 83
LitePanels 1×1 LED: 90
Arri HMI: 90+
Kino Flo: 95
Tungsten: 100 (maximum)

 

Problems with C.R.I.

There have been many criticisms of the CRI system. One is that the use of mean averaging results in a lamp with mediocre performance across all the patches scoring the same CRI as a lamp that does terrible rendering of one colour but good rendering of all the others.

Demonstrating the non-continuous spectrum of a fluorescent lamp, versus the continuous spectrum of incandescent, using a prism.

Further criticisms relate to the colour patches themselves. The eight standard patches are low in saturation, making them easier to render accurately than bright colours. An unscrupulous manufacturer could design their lamp to render the test colours well without worrying about the rest of the spectrum.

In practice this all means that CRI ratings sometimes don’t correspond to the evidence of your own eyes. For example, I’d wager that an HMI with a quoted CRI in the low nineties is going to render more natural skin-tones than an LED panel with the same rating.

I prefer to assess the quality of a light source by eye rather than relying on any quoted CRI value. Holding my hand up in front of an LED fixture, I can quickly tell whether the skin tones look right or not. Unfortunately even this system is flawed.

The fundamental issue is the trichromatic nature of our eyes and of cameras: both work out what colour things are based on sensory input of only red, green and blue. As an analogy, imagine a wall with a number of cracks in it. Imagine that you can only inspect it through an opaque barrier with three slits in it. Through those three slits, the wall may look completely unblemished. The cracks are there, but since they’re not aligned with the slits, you’re not aware of them. And the “slits” of the human eye are not in the same place as the slits of a camera’s sensor, i.e. the respective sensitivities of our long, medium and short cones do not quite match the red, green and blue dyes in the Bayer filters of cameras. Under continuous-spectrum lighting (“smooth wall”) this doesn’t matter, but with non-continuous-spectrum sources (“cracked wall”) it can lead to something looking right to the eye but not on camera, or vice-versa.

 

Conclusion

Given its age and its intended use, it’s not surprising that CRI is a pretty poor indicator of light quality for a modern DP or gaffer. Various alternative systems exist, including GAI (Gamut Area Index) and TLCI (Television Lighting Consistency Index), the latter similar to CRI but introducing a camera into the process rather than relying solely on human observation. The Academy of Motion Picture Arts and Sciences recently invented a system, Spectral Similarity Index (SSI), which involves measuring the source itself with a spectrometer, rather than reflected light. At the time of writing, however, we are still stuck with CRI as the dominant quantitative measure.

So what is the solution? Test, test, test. Take your chosen camera and lens system and shoot some footage with the fixtures in question. For the moment at least, that is the only way to really know what kind of light you’re getting.



“The Knowledge”: Lighting a Multi-camera Game Show

Metering the key-light. Photo: Laura Radford

Last week I discussed the technical and creative decisions that went into the camerawork of The Knowledge, a fake game show for an art installation conceived by Ian Wolter and directed by Jonnie Howard. This week I’ll break down the choices and challenges involved in lighting the film.

The eighties quiz shows which I looked at during prep were all lit with the dullest, flattest light imaginable. It was only when I moved forward to the nineties shows which Jonnie and I grew up on, like Blockbusters and The Generation Game, that I started to see some creativity in the lighting design: strip-lights and glowing panels in the sets, spotlights and gobos on the backgrounds, and moodier lighting states for quick-fire rounds.

Jonnie and I both wanted The Knowledge‘s lighting to be closer to this nineties look. He was keen to give each team a glowing taxi sign on their desks, which would be the only source of illumination on the contestants at certain moments. Designer Amanda Stekly and I came up with plans for additional practicals – ultimately LED string-lights – that would follow the map-like lines in the set’s back walls.

Once the set design had been finalised, I did my own dodgy pencil sketch and Photoshopped it to create two different lighting previsualisations for Jonnie.

He felt that these were a little too sophisticated, so after some discussion I produced a revised previz…

…and a secondary version showing a lighting state with one team in shadow.

These were approved, so now it was a case of turning those images into reality.

We were shooting on a soundstage, but for budget reasons we opted not to use the lighting grid. I must admit that this worried me for a little while. The key-light needed to come from the front, contrary to normal principles of good cinematography, but very much in keeping with how TV game shows are lit. I was concerned that the light stands and the cameras would get in each others’ way, but my gaffer Ben Millar assured me it could be done, and of course he was right.

Ben ordered several five-section Strato Safe stands (or Fuck-offs as they’re charmingly known). These were so high that, even when placed far enough back to leave room for the cameras, we could get the 45° key angle which we needed in order to avoid seeing the contestants’ shadows on the back walls. (A steep key like this is sometimes known as a butterfly key, for the shape of the shadow which the subject’s nose casts on their upper lip.)  Using the barn doors, and double nets on friction arms in front of the lamp-heads, Ben feathered the key-light to hit as little as possible of the back walls and the fronts of the desks. As well as giving the light some shape, this prevented the practical LEDs from getting washed out.

Note the nets mounted below the key-lights (the tallest ones). Photo: Laura Radford

Once those key-lights were established (a 5K fresnel for each team), we set a 2K backlight for each team as well. These were immediately behind the set, their stands wrapped in duvetyne, and the necks well and truly broken to give a very toppy backlight. A third 2K was placed between the staggered central panels of the set, spilling a streak of light out through the gap from which host Robert Jezek would emerge.

A trio of Source Fours with 15-30mm zoom lenses were used for targeted illumination of certain areas. One was aimed at The Knowledge sign, its cutters adjusted to form a rectangle of light around it. Another was focused on the oval map on the floor, which would come into play during the latter part of the show. The last Source Four was used as a follow-spot on Robert. We had to dim it considerably to keep the exposure in range, which conveniently made him look like he had a fake tan! In fact, Ben hooked everything up to a dimmer board, so that various lighting cues could be accomplished in camera.

The bulk of the film was recorded in a single day, following a day’s set assembly and a day of pre-rigging. A skeleton crew returned the next day to shoot pick-ups and promos, a couple of which you can see on Vimeo here.

I’ll leave you with some frame grabs from the finished film. Find out more about Ian Wolter’s work at ianwolter.com.



Colour Schemes

Last week I looked at the science of colour: what it is, how our eyes see it, and how cameras see and process it. Now I’m going to look at colour theory – that is, schemes of mixing colours to produce aesthetically pleasing results.

 

The Colour wheel

The first colour wheel was drawn by Sir Isaac Newton in 1704, and it’s a precursor of the CIE diagram we met last week. It’s a method of arranging hues so that useful relationships between them – like primaries and secondaries, and the schemes we’ll cover below – can be understood. As we know from last week, colour is in reality a linear spectrum which we humans perceive by deducing it from the amounts of light triggering our red, green and blue cones, but certain quirks of our visual system make a wheel in many ways a more useful arrangement of the colours than a linear spectrum.

One of these quirks is that our long (red) cones, although having peak sensitivity to red light, have a smaller peak in sensitivity at the opposite (violet) end of the spectrum. This may be what causes our perception of colour to “wrap around”.

Another quirk is in the way that colour information is encoded in the retina before being piped along the optic nerve to the brain. Rather than producing red, green and blue signals, the retina compares the levels of red to green, and of blue to yellow (the sum of red and green cones), and sends these colour opponency channels along with a luminance channel to the brain.

You can test these opposites yourself by staring at a solid block of one of the colours for around 30 seconds and then looking at something white. The white will initially take on the opposing colour, so if you stared at red then you will see green.

Hering’s colour wheels

19th century physiologist Ewald Hering was the first to theorise about this colour opponency, and he designed his own colour wheel to match it, having red/green on the vertical axis and blue/yellow on the horizontal.

RGB colour wheel

Today we are more familiar with the RGB colour wheel, which spaces red, green and blue equally around the circle. But both wheels – the first dealing with colour perception in the eye-brain system, and the second dealing with colour representation on an RGB screen – are relevant to cinematography.

On both wheels, colours directly opposite each other are considered to cancel each other out. (In RGB they make white when combined.) These pairs are known as complementary colours.
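As a concrete (if crude) illustration of my own, an RGB complement can be found simply by subtracting each channel from full white, since the pair must add up to white:

```python
def rgb_complement(colour):
    """Complementary colour on the RGB wheel: the two values sum to white."""
    r, g, b = colour
    return (255 - r, 255 - g, 255 - b)

print(rgb_complement((255, 128, 0)))  # orange -> (0, 127, 255), a blue
```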

 

Complementary

A complementary scheme provides maximum colour contrast, each of the two hues making the other more vibrant. Take “The Snail” by modernist French artist Henri Matisse, which you can currently see at the Tate Modern; Matisse placed complementary colours next to each other to make them all pop.

“The Snail” by Henri Matisse (1953)

In cinematography, a single pair of complementary colours is often used, for example the yellows and blues of Aliens‘ power loader scene:

“Aliens” DP: Adrian Biddle, BSC

Or this scene from Life on Mars which I covered on my YouTube show Lighting I Like:

I frequently use a blue/orange colour scheme, because it’s the natural result of mixing tungsten with cool daylight or “moonlight”.

“The First Musketeer”, DP: Neil Oseman

And then of course there’s the orange-and-teal grading so common in Hollywood:

“Hot Tub Time Machine” DP: Jack N. Green, ASC

Amélie uses a less common complementary pairing of red and green:

“Amélie” DP: Bruno Delbonnel, AFC, ASC

 

Analogous

An analogous colour scheme uses hues adjacent to each other on the wheel. It lacks the punch and vibrancy of a complementary scheme, instead having a harmonious, unifying effect. In the examples below it seems to enhance the single-mindedness of the characters. Sometimes filmmakers push analogous colours to the extreme of using literally just one hue, at which point it is technically monochrome.

“The Matrix” DP: Bill Pope, ASC
“Terminator 2: Judgment Day” DP: Adam Greenberg, ASC
“The Double” DP: Erik Alexander Wilson
“Total Recall” (1990) DP: Jost Vacano, ASC, BVK

 

There are other colour schemes, such as triadic, but complementary and analogous colours are by far the most common in cinematography. In a future post I’ll look at the psychological effects of individual colours and how they can be used to enhance the themes and emotions of a film.



Creating “Stasis”

Stasis is a personal photography project about time and light. You can view all the images here, and in this post I’ll take you through the technical and creative process of making them.

I got into cinematography directly through a love of movies and filmmaking, rather than from a fine art background. To plug this gap, over the past few years I’ve been trying to give myself an education in art by going to galleries, and reading art and photography books. I’ve previously written about how JMW Turner’s work captured my imagination, but another artist whose work stood out to me was Gerrit (a.k.a. Gerard) Dou. Whereas most of the Dutch 17th century masters painted daylight scenes, Dou often portrayed people lit by only a single candle.

“A Girl Watering Plants” by Gerrit Dou

At around the same time as I discovered Dou, I researched and wrote a blog post about Barry Lyndon‘s groundbreaking candlelit scenes. This got me fascinated by the idea that you can correctly expose an image without once looking at a light meter or digital monitor, because tables exist giving the appropriate stop, shutter and ISO for any given light level… as measured in foot-candles. (One foot-candle is the amount of light received from a standard candle that is one foot away.)

So when I bought a 35mm SLR (a Pentax P30T) last autumn, my first thought was to recreate some of Dou’s scenes. It would be primarily an exercise in exposure discipline, training me to judge light levels and fall-off without recourse to false colours, histograms or any of the other tools available to a modern DP.

I conducted tests with Kate Madison, who had also agreed to furnish period props and costumes from the large collection which she had built up while making Born of Hope and Ren: The Girl with the Mark. Both the tests and the final images were captured on Fujifilm Superia X-tra 400. Ideally I would have tested multiple stocks, but I must confess that the costs of buying and processing several rolls were off-putting. I’d previously shot some basic latitude tests with Superia, so I had some confidence about what it could and couldn’t do. (It can be over-exposed at least five stops and still look good, but more than a stop under and it falls apart.) I therefore confined myself to experimenting with candle-to-subject distances, exposure times and filtration.

The tests showed that the concept was going to work, and also confirmed that I would need to use an 80B filter to cool the “white balance” of the film from its native daylight to tungsten (3400K). (As far as I can tell, tungsten-balanced stills film is no longer on the market.) Candlelight has a colour temperature of about 1800K, so it still reads as orange through an 80B, but without the filter it’s an ugly red.

Meanwhile, the concept had developed beyond simply recreating Gerrit Dou’s scenes. I decided to add a second character, contrasting the historical man lit only by his candle with a modern girl lit only by her phone. Flames have a hypnotic power, tapping into our ancient attraction to light, and today’s smartphones have a similarly powerful draw.

The candlelight was 1600K warmer than the filtered film, so I used an app called Colour Temp to set my iPhone to 5000K, making it 1600K cooler than the film; the phone would therefore look as blue as the candle looked orange. (Unfortunately my phone died quickly and I had trouble recharging it, so some of the last shots were done with Izzi’s non-white-balanced phone.) To match the respective colours of light, we dressed Ivan in earthy browns and Izzi in blues and greys.

Artemis recce image

We shot in St. John’s Church in Duxford, Cambridgeshire, which hasn’t been used as a place of worship since the mid-1800s. Unique markings, paintings and graffiti from the middle ages up to the present give it simultaneously a history and a timelessness, making it a perfect match to the clash of eras represented by my two characters. It resonated with the feelings I’d had when I started learning about art and realised the continuity of techniques and aims from me in my cinematography back through time via all the great artists of the past to the earliest cave paintings.

I knew from the tests that long exposures would be needed. Extrapolating from the exposure table, one foot-candle would require a 1/8th of a second shutter with my f1.4 lens wide open and the Fujifilm’s ISO of 400. The 80B has a filter factor of three, meaning you need three times more light, or, to put it another way, it cuts 1 and 2/3rds of a stop. Accounting for this, and the fact that the candle would often be more than a foot away, or that I’d want to see further into the shadows, the exposures were all at least a second long.
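As a quick sanity check, here is that extrapolation written out in Python, using the standard exposure relationship (illuminance in fc ≈ 25f² ÷ (shutter × ISO)); the function is only illustrative:

```python
import math

def shutter_seconds(footcandles, f_stop, iso, filter_factor=1.0):
    """Shutter time (s) needed for a given illuminance, aperture and ISO,
    from fc = 25 * f^2 / (s * ISO); a filter factor multiplies the time."""
    return filter_factor * 25 * f_stop ** 2 / (footcandles * iso)

# One foot-candle, f/1.4 wide open, ISO 400, no filter: about 1/8th of a second
print(shutter_seconds(1, 1.4, 400))     # ~0.12 s
# With the 80B's filter factor of 3, the exposure roughly triples:
print(shutter_seconds(1, 1.4, 400, 3))  # ~0.37 s
print(math.log2(3))                     # ~1.58, i.e. about 1 and 2/3 stops
```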

As time had become very much the theme of the project, I decided to make the most of these long exposures by playing with motion blur. Not only does this allow a static image – paradoxically – to show a passage of time, but it recalls 19th century photography, when faces would often blur during the long exposures required by early emulsions. Thus the history of photography itself now played a part in this time-fluid project.

I decided to shoot everything in portrait, to make it as different as possible from my cinematography work. Heavily inspired by all the classical art I’d been discovering, I used eye-level framing, often flat-on and framed architecturally with generous headroom, and a normal lens (an Asahi SMC Pentax-M 50mm/f1.4) to provide a natural field of view.

I ended up using my light meter quite a lot, though not necessarily exposing as it indicated. It was all educated guesswork, based on what the meter said and the tests I’d conducted.

I was tempted more than once to tell a definite story with the images, and had to remind myself that I was not making a movie. In the end I opted for a very vague story which can be interpreted many ways. Which of the two characters is the ghost? Or is it both of them? Are we all just ghosts, as transient as motion blur? Do we unwittingly leave an intangible imprint on the universe, like the trails of light my characters produce, or must we consciously carve our mark upon the world, as Ivan does on the wall?

Models: Izzi Godley & Ivan Moy. Stylist: Kate Madison. Assistant: Ash Maharaj. Location courtesy of the Churches Conservation Trust. Film processing and scanning by Aperture, London.


The Inverse Square Law

If you’ve ever read or been taught about lighting, you’ve probably heard of the Inverse Square Law. It states that light fades in proportion to the square of the distance from the source. But lately I started to wonder if this really applies in all situations. Join me as I attempt to get to the bottom of this…

 

Knowing the law

The seed of this post was sown almost a year ago, when I read Herbert McKay’s 1947 book The Tricks of Light and Colour, which described the Inverse Square Law in terms of light spreading out. (Check out my post about The Tricks of Light and Colour here.)

But before we go into that, let’s get the Law straight in our minds. What, precisely, does it say? Another excellent book, Gerald Millerson’s Lighting for Television and Film, defines it thusly:

With increased distance, the light emitted from a given point source will fall rapidly, as it spreads over a progressively larger area. This fall-off in light level is inversely proportional to the distance square, i.e. 1/d². Thus, doubling the lamp distance would reduce the light to ¼.

The operative word, for our purposes, is “spreads”.

If you’d asked me a couple of years ago what causes the Inverse Square Law, I probably would have mumbled something about light naturally losing energy as it travels. But that is hogwash of the highest order. Assuming the light doesn’t strike any objects to absorb it, there is nothing to reduce its energy. (Air does scatter – and presumably absorb – a very small amount of light, hence atmospheric haze, but this amount will never be significant on the scale a cinematographer deals with.)

In fact, as the Millerson quote above makes clear, the Inverse Square Law is a result of how light spreads out from its source. It’s purely geometry. In this diagram you can see how fewer and fewer rays strike the ‘A’ square as it gets further and further away from the source ‘S’:

Illustration by Borb, CC BY-SA 3.0

Each light ray (dodgy term, I know, but sufficient for our purposes) retains the same level of energy, and there are the same number of them overall, it’s just that there are fewer of them passing through any given area.

So far, so good.

 

Taking the Law into my own hands

During season two of my YouTube series Lighting I Like, I discussed Dedo’s Panibeam 70 HMI. This fixture produces collimated light, light of which all the waves are travelling in parallel. It occurred to me that this must prevent them spreading out, and therefore render the Inverse Square Law void.

This in turn got me thinking about more common fixtures – par cans, for example.

 

Par lamps are so named for the Parabolic Aluminised Reflectors they contain. These collect the light radiated from the rear and sides of the filament and reflect it as parallel rays. So to my mind, although light radiated from the very front of the filament must still spread and obey the Inverse Square Law, that which bounces off the reflector should theoretically never diminish. You can imagine that the ‘A’ square in our first diagram would have the same number of light rays passing through it every time if they are travelling in parallel.

Similarly, fresnel lenses are designed to divert the spreading light waves into a parallel pattern:

Even simple open-face fixtures have a reflector which can be moved back and forth using the flood/spot control, affecting both the spread and the intensity of the light. Hopefully by now you can see why these two things are related. More spread = more divergence of light rays = more fall-off. Less spread = less divergence of light rays = more throw.

So, I wondered, am I right? Do these focused sources disobey the Inverse Square Law?

 

Breaking the law

To find the answer, I waded through a number of fora.

Firstly, and crucially, everyone agrees that the Law describes light radiated from a point source, so any source which isn’t infinitely small will technically not be governed by the Law. In practice, says the general consensus, the results predicted by the Law hold true for most sources, unless they are quite large or very close to the subject.

If you are using a softbox, a Kinoflo or a trace frame at short range though, the Inverse Square Law will not apply.

The above photometric data for a Filmgear LED Flo-box indeed shows a slower fall-off than the Law predicts. (Based on the 1m intensity, the Law predicts the 2m and 3m intensities as 970÷2²=243 lux and 970÷3²=108 lux respectively.)
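That bracketed check is easy to reproduce; here is the same inverse-square prediction as a couple of lines of Python, using the quoted 1m figure:

```python
def predicted_lux(known_lux, known_distance, new_distance):
    """Inverse Square Law prediction: intensity scales with (d1/d2)^2."""
    return known_lux * (known_distance / new_distance) ** 2

# Quoted 970 lux at 1m; the Law's predictions for 2m and 3m:
print(predicted_lux(970, 1, 2))  # 242.5 lux
print(predicted_lux(970, 1, 3))  # ~108 lux
# The manufacturer's measured figures are higher, i.e. a slower fall-off.
```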

A Flickr forum contributor called Severin Sadjina puts it like this:

In general, the light will fall off as 1/d² if the size of the light source is negligible compared to the distance d to the light source. If, on the other hand, the light source is significantly larger than the distance d to the light source, the light will fall off as 1/d – in other words: slower than the Inverse Square Law predicts.

Another contributor, Ftir, claims that a large source will start to follow the Law above distances equal to about five times the largest side of the source, so a 4ft Kinoflo would obey the Law very closely after about 20ft. This claim is confirmed by Wikipedia, citing A. Ryer’s The Light Measurement Handbook.

But what about those pesky parallel light beams from the pars and fresnels?

Every forum had a lot of disagreement on this. Most people agree that parallel light rays don’t really exist in real life. They will always diverge or converge, slightly, and therefore the Law applies. However, many claim that it doesn’t apply in quite the same way.

Diagram from a tutorial PDF on light-measurement.com showing a virtual point source behind the bulb of a torch.

A fresnel, according to John E. Clark on Cinematography.com, can still be treated as a point source, but that point source is actually located somewhere behind the lamp-head! It’s a virtual point source. (Light radiating from a distant point source has approximately parallel rays with consequently negligible fall-off, e.g. sunlight.) So if this virtual source is 10m behind the fixture, then moving the lamp from 1m from the subject to 2m is not doubling the distance (and therefore not quartering the intensity). In fact it is multiplying the distance by 1.09 (12÷11=1.09), so the light would only drop to 84% of its former intensity (1÷1.09²=0.84).
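Here is that example as a short sketch, with the 10m virtual-source offset treated as a given (it is a hypothetical figure from the explanation above, not manufacturer data):

```python
def falloff_with_virtual_source(d1, d2, source_offset):
    """Ratio of intensity at distance d2 versus d1, when the effective
    point source sits source_offset metres behind the fixture."""
    return ((d1 + source_offset) / (d2 + source_offset)) ** 2

# Moving the fresnel from 1m to 2m with a virtual source 10m behind it:
print(falloff_with_virtual_source(1, 2, 10))  # ~0.84, i.e. 84% of the intensity
# A true point source at the fixture itself would quarter the light:
print(falloff_with_virtual_source(1, 2, 0))   # 0.25
```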

I tried to confirm this using the Arri Photometrics App, but the data it gives for Arri’s fresnel fixtures conforms perfectly with an ordinary point source under the Law, leaving me somewhat confused. However, I did find some data for LED fresnels that broke the Law, for example the Lumi Studio 300:

As you can see, at full flood (bottom graphic) the Law is obeyed as expected; the 8m intensity of 2,500 lux is a quarter of the 4m intensity of 10,000 lux. But when spotted (top graphic) it falls off more rapidly. Again, very confusing, as I was expecting it to fall off less rapidly if the rays are diverging but close to parallel.

A more rapid fall-off suggests a virtual point source somewhere in front of the lamp-head. This was mentioned in several places on the fora as well. The light is converging, so the intensity increases as you move further from the fixture, reaching a maximum at the focal point, then diverging again from that point as per the Inverse Square Law. In fact, reverse-engineering the above data using the Law tells me – if my maths is correct – that the focal point is 1.93m in front of the fixture. Or, to put it another way, spotting this fixture is equivalent to moving it almost 2m closer to the subject. However, this doesn’t seem to tally with the beam spread data in the above graphics. More confusion!

I decided to look up ETC’s Source Four photometrics, since these units contain an ellipsoidal reflector which should focus the light (and therefore create a virtual point source) in front of themselves. However, the data shows no deviation from the Law and no evidence of a virtual point source displaced from the actual source.

 

I fought the law and the law won

I fear this investigation has left me more confused than when I started! Clearly there are factors at work here beyond what I’ve considered.

However, I’ve learnt that the Inverse Square Law is a useful means of estimating light fall-off for most lighting fixtures – even those that really seem like they should act differently! If you double the distance from lamp to subject, you’re usually going to quarter the intensity, or near as damn it. And that rule of thumb is all we cinematographers need 99% of the time. If in doubt, refer to photometrics data like that linked above.

And if anyone out there can shed any light (haha) on the confusion, I’d be very happy to hear from you!
