5 Things You Didn’t Know About the Iris in Your Lens

Inside a lens, amongst the various glass elements, is an ingenious mechanism which we call the iris. Just like your biological iris, it controls the amount of light passing through the pupil to form an image. I’ve written before about using the iris to control exposure, and about its well-known side effect of controlling depth of field. But here are five things that aren’t so commonly known about irises.

 

1. f-stops and the entrance pupil

This image shows the exit pupil because it’s seen through the rear element of the lens. A view through the front element would show the entrance pupil.

The f-number of a lens is the ratio of the focal length to the diameter of the aperture, but did you know that it isn’t the actual diameter of the aperture that’s used in this calculation? It’s the apparent diameter as viewed through the front of the lens. A lens might have a magnifying front element, causing the aperture to appear larger than its physical size, or a reducing one, causing it to appear smaller. Either way, it’s this apparent aperture – known as the entrance pupil – which is used to find the f-number.
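As a quick sketch of the ratio described above, here's the calculation in Python (the 50mm/25mm figures are hypothetical, purely for illustration):

```python
def f_number(focal_length_mm: float, entrance_pupil_mm: float) -> float:
    """f-number = focal length / entrance pupil diameter.
    The entrance pupil is the aperture's *apparent* diameter as seen
    through the front of the lens, not its physical size."""
    return focal_length_mm / entrance_pupil_mm

# A hypothetical 50mm lens whose iris appears 25mm wide from the front:
print(f_number(50, 25))  # → 2.0, i.e. f/2
```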

 

2. No-parallax point

The no-parallax point of a lens is located at its entrance pupil. Sometimes called the nodal point, although that’s technically something different, this is the point around which the camera must pan and tilt if you want to eliminate all parallax. This is important for forced perspective work, for panoramas stitched together from multiple shots, and other types of VFX.

 

3. Focus

If you need to check your focal distance with a tape measure, many cameras have a handy Phi symbol on the side indicating where the sensor plane is located so that you can measure from that point. But technically you should be measuring to the entrance pupil. The sensor plane marker is just a convenient shortcut because the entrance pupil is in a different place for every lens and changes when the lens is refocused or zoomed. In most cases the depth of field is large enough for the shortcut to give perfectly acceptable results, however.

 

4. Bokeh shape

The bokeh of a 32mm Cooke S4 wide open at T2 (left) and stopped down to T2.8 (right). Note also the diffraction spikes visible in the right-hand image.

The shape of the entrance pupil determines the shape of the image’s bokeh (out of focus areas), most noticeable in small highlights such as background fairy lights. The pupil’s shape is determined both by the number of iris blades and the shape of their edges. The edges are often curved to approximate a circle when the iris is wide open, but form more of a polygon when stopped down. For example, a Cooke S4 produces octagonal bokeh at most aperture settings, indicating eight iris blades. Incidentally, an anamorphic lens has a roughly circular aperture like any other lens, but the entrance pupil (and hence the bokeh) is typically oval because of the anamorphosing effect of the front elements.

 

5. Diffraction spikes

When the edge of an iris blade is straight or roughly straight, it spreads out the light in a perpendicular direction, creating a diffraction spike. The result is a star pattern around bright lights, typically most visible at high f-stops. Every blade produces a pair of spikes in opposite directions, so the number of points in the star is equal to twice the number of iris blades – as long as that number is odd. If the number of blades is even, diffraction spikes from opposite sides of the iris overlap, so the number of apparent spikes is the same as the number of blades, as in the eight-pointed Cooke diffraction pictured above right.
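The odd/even counting rule above can be sketched in a few lines of Python (a toy illustration of the arithmetic, not an optics simulation):

```python
def visible_spikes(blade_count: int) -> int:
    """Each blade edge diffracts light into a pair of opposite spikes.
    With an odd number of blades the pairs interleave, giving 2 x blades
    spikes; with an even number, opposite pairs overlap exactly, so the
    visible count equals the blade count."""
    return 2 * blade_count if blade_count % 2 else blade_count

print(visible_spikes(8))  # eight blades → an 8-pointed star
print(visible_spikes(5))  # five blades → a 10-pointed star
```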


Newton Thomas Sigel on the Cinematography of “Bohemian Rhapsody”

The following article originally appeared on RedShark News in 2018.

Directed by Bryan Singer, of X-Men and The Usual Suspects fame, Bohemian Rhapsody charts the story of Queen from their formation in 1970 to their triumphant Live Aid set in 1985, with plenty of their classic rock hits along the way. Rami Malek (from Amazon’s Mr. Robot) turns in an Oscar-winning performance as larger-than-life frontman Freddie Mercury.

Serving as director of photography, in his tenth collaboration with Singer, was Newton Thomas Sigel, ASC. I spoke to Sigel about how he approached evoking an era, recreating the concerts, and lensing a legend.

“Every day was this wonderful trip back in time,” enthuses Sigel, who saw the movie as a chance to relive his own youth. “I love shooting music. There is this wonderful transition from the end of the counter-culture, through glam-rock into the hedonism of the eighties.”

Shooting digitally, Sigel employed both the Alexa SXT and Alexa 65. “I decided the movie needed to have a visual arc that best represented the band’s transition from idealists to rock stars, and all the issues that creates. To that effect, I did the first act with old Cooke Speed Panchro lenses on the Alexa SXT. As Queen is discovered, and begins to be known on the international stage, we transition to the Alexa 65.” Sigel later fine-tuned this arc during grading.

The cinematographer paired the large-format Alexa 65 with Prime DNA and Prime 65-S glass, testing all the lenses to find the ones with the most gentle fall-off in focus. “Each lens had its own personality, and was never really ‘perfect’. Our 28mm had a particularly crazy quality that, when used sparingly, had great effect.”

One thing that struck me immediately about the cinematography is the distinctly un-British, warm and glowing look, with lots of sun streaming through windows. This was all part of Sigel’s plan, which develops as the film progresses. “What begins as warm and golden, with its own special LUT, grows ever sharper and cooler, even desaturated,” he explains. “The beginning is all handheld and grainy, the rest much cleaner, with the camera on Steadicam and crane.”

Sigel took a down-to-earth approach to photographing Malek’s Mercury. “I always wanted Freddie to feel very real,” he states. “It is important that you sense his vulnerability at the same time as he is projecting the bravado of the consummate showman. Like so many great performers, Freddie exuded confidence and brashness on stage, and yet, had a terribly shy insecurity in ‘real’ life.”

The highlights of Bohemian Rhapsody are undoubtedly the concert scenes. To tackle these, Sigel began by watching every single piece of Queen footage he could lay his hands on, noting the development of the stage lighting over the years. “I wanted to be as faithful to that as I could, while still having it service our story,” he says. That meant eschewing the easily coloured RGB LED fixtures so common in movies and concerts today, and going back to the traditional method of laboriously changing gels on tungsten units. “We stuck to period lights,” Sigel confirms, “predominantly par cans and follow spots.”

The sheer number of concert scenes was a challenge for the filmmakers, who at one point had to shoot four gigs in just two days. “We had so many concerts to shoot and so little time, I needed to develop a system to quickly change from one venue to the next,” Sigel recalls. “Because Queen’s lighting was based on large racks of par cans, we were able to construct a very modular system that would allow us to raise or lower different sections very quickly. By pre-programming lighting sequences, we could also create sequence patterns with different configurations of light pods to make it look like a different venue.”

The types of units change as the story progresses through the band’s career. “By the late 70s, Queen was among the first bands to adopt the Vari-Lite, which was championed by the band Genesis,” Sigel explains. “That opened up many more possibilities in the theatrical lighting, which also reflected the band’s ascendancy to the upper echelons of the rock world.”

Sigel notes that he embraced all opportunities to capture lens flares from the concert lighting. “There is a great moment during Live Aid where Freddie makes this sweeping gesture through a circular flare, and it almost seems as if he is drawing on the lens.”

The historic Live Aid concert forms the jubilant climax of the film. Queen’s entire 20-minute set was recreated over seven days of shooting. “We photographed it in every type of weather Great Britain has ever seen: rain, sun, overcast, front-light, backlight – you name it. We couldn’t afford to silk the area as I would have liked,” Sigel adds, referring to large sheets of diffusion hung from cranes to maintain a soft, consistent daylight. “So it was a constant battle and the DI [digital intermediate] certainly helped.”

Filming for Bohemian Rhapsody began in autumn 2017, but by December trouble was brewing. Twentieth Century Fox halted production for a while, with the Hollywood Reporter citing the “unexpected unavailability of director Bryan Singer” as the reason. Dexter Fletcher, who had been attached to direct the nascent film back in 2013, ultimately replaced Singer for the last leg of photography.

“A change like that is never ideal,” admits Sigel, “but Dexter was very impressed with what we had done so far. With only a couple weeks to go, he was happy to carry on in the direction we had begun. Obviously he brought some of his own personal touches, but what I noticed the most was the ease he had in communicating with the actors.”

Reflecting on his long history of collaboration with Singer, Sigel is very positive. “When you have done that many movies together, there is a shorthand that develops and makes much of the work easier because you know your parameters from the beginning. Bohemian Rhapsody was truly the ‘labour of love’ cliché for so many people involved; it was quite remarkable.”

Asked to sum up the appeal of Bohemian Rhapsody, the cinematographer declares, “The film has everything – a deep emotional core at the centre of what is otherwise an exuberant celebration of Queen’s music. I also think Freddie’s story of an immigrant outsider just trying to fit in has a resonance today that is very profound.”


“Harvey Greenfield is Running Late”: Week 1

Day 1

The weather was dry and overcast, shedding a pleasantly soft light on the proceedings as the crew of Harvey Greenfield is Running Late set up for our first scene, in front of a small primary school in rural Cambridgeshire.

Then we started shooting and the weather went bananas.

One moment we had bright sunshine, the next we had heavy rain bordering on hail… sometimes in the same take. We had lots of fun and games dodging the showers, manoeuvring a 12×12′ silk to soften the sun, keeping reflections and shadows out of shot, waiting for noisy trains to pass, and trying to get through takes without the light changing. But we got there in the end.

In the afternoon we moved into the school hall, which we were using as a makeshift studio. As well as numerous flashbacks, the film includes several imaginary sequences, including a spoof advert. This we shot against a black backdrop using dual backlights, one on either side, to highlight the talent. I totally stole this look from the Men in Black poster.

Our last shot of the day was Harvey’s first, and another imaginary scene, this time set in a coffin. To give the appearance of it being underground, the coffin (with no lid and one side missing) was placed on rostra with a black drape hanging below it. To create darkness above it, we simply set a flag in front of camera. Harvey (Paul Richards) lights a match to illuminate himself, which gaffer Stephen Allwright supplemented with two 1×1′ Aladdin Bi-flexes set to tungsten and gelled even more orange.

 

Day 2

One of the few occasions in my life when I’ve been able to walk to set from home: we started at the University Arms Hotel overlooking Parker’s Piece, one of Cambridge’s many green spaces (and, fact fans, the place where the rules of Association Football were first established).

The hotel’s function room was dressed as an upmarket restaurant, where we captured Harvey’s first date with his girlfriend Alice (Liz Todd). We shot towards a window; putting your main light source in the background is always a good move, and it gave us the perfect excuse to do soft cross-backlight on the two characters. The room’s wood panelling and sconces looked great on camera too.

The unit then moved to Emmaus, a large charity shop north of the city, where we filmed a Wall of Pants and some tightly choreographed Sandwich Action. Here we broke out the Astera tubes for the first time, using them as a toppy, fluorescent-style key-light and backlight.

By now we were getting into the visual rhythm of the film, embracing wide angles (our 18-35mm zoom gets heavy use), central framing (or sometimes short-siding), Wes Anderson-type pans/tilts, and a 14mm lens and/or handheld moves for crazier moments.

 

Day 3

We were based at Paul’s house for day 3, beginning in the street outside for a brief scene in his car. Shooting from the back, we mounted an Aladdin in the passenger seat to key Paul, and blacked out some of the rear windows to create negative fill, much like I did for the driving scenes in Above the Clouds.

The rest of the day was spent in and around Paul’s shed. Or, to be more specific, the middle one of his three sheds. This is Harvey’s “Happy Place” so I stepped up from the Soft FX 0.5 filter I’d been shooting with so far to the Soft FX 1, to diffuse the image a little more. We also used haze for the only time on the film.

Some shots through the shed window gave us the usual reflection challenges. Stephen rigged a 12×12 black solid to help with this, and we draped some bolton over the camera. Inside the shed we used an Aladdin to bring up the level, and once we stopped shooting through the window we fired a tungsten 2K in through there instead. This was gelled with just half CTB so that it would still be warm compared with the daylight, and Stephen swapped the solid for a silk to keep the natural light consistent and eliminate the real direct sun.

I made my first use of the Red Gemini’s low light mode today, switching to ISO 3200 to maintain the depth of field when filming in slow motion. (I have been shooting at T4-5.6 because a sharper, busier background feels more stressful for Harvey.)

 

Day 4

Back to the primary school. We spent the morning outside shooting flashbacks with some talented child actors from the Pauline Quirke Academy. We got some nice slider shots and comedy pans while dealing with the ever-changing cloud cover.

Inside in the afternoon we picked up a dropped scene from day 1, then moved on to one of the film’s biggest challenges: a six-minute dialogue scene travelling through a corridor and around a classroom, to be filmed in a single continuous Steadicam shot. This could easily have been a nightmare, but a number of factors worked in our favour. Firstly, we had rehearsed the scene on location with actors and a phone camera during pre-production. Secondly, we had the brilliant Rupert Peddle operating the Steadicam. Thirdly, it would have been so difficult to keep a boom and its shadows out of shot that mixer Filipe Pinheiro and his team didn’t even try, instead relying on lavaliers and a mic mounted on the camera.

For similar reasons, we didn’t do much lighting either; there were almost no areas of the rooms and their ceilings that didn’t come into shot at some point. In two places Stephen rigged blackout for negative fill. I then chose which of the existing ceiling lights to turn off and which to keep on, to get as much shape into the image as possible. We tried to rig a grid onto one of the ceiling lights to take it off a wall that was getting too hot, but after one take we realised that this was in frame, so instead we stuck a square of ND gel to it. We also rigged two Astera tubes in the corridor, but discovered that one of those came into frame too, so in the end a single Astera tube was the only additive lighting. The existing ceiling lights worked particularly well for a slow push-in to Alice near the end of the shot, providing her with both key and backlight from perfect angles.

 

Day 5

Today we shot a big scene based around a school play. Production designer Amanda Stekly had created a suitably cheesy, sparkly backdrop, and more PQA students dressed up in weird and wonderful costumes to enact snatches of a very random production called Spamlet (making it the second time this year I’ve shot “to be or not to be”, though this time was… er… a little different).

The school had a basic lighting rig already. We refocused and re-gelled some of the lights, keeping it very simple and frontal. Behind the set I put one of my old 800W Arri Lites as a backlight for the kids on stage. To one side, where Alice was standing, we used two Astera tubes, one to key her and one to backlight her. These were both set to a cool, slightly minty colour. My idea of using green for calming characters and moments hasn’t come to fruition quite as I’d planned, because it hasn’t fitted the locations and other design elements, but there’s a little hint of it here.

For the audience, Stephen rigged an Aputure 300D to the ceiling as a backlight, then we bounced the stage lighting back onto them using a silver board. We also used the school’s follow spot, which gave us some nice flares for the stressful moments later in the scene. It was daytime both in reality and in the story, but we closed the (thin) curtains and reduced the ambience outside with floppy flags so that the artificial lighting would have more effect.

We had to move at breakneck speed in the afternoon to get everything in the can before wrap time, but we managed it, finishing our first week on schedule. No mean feat.


Pre-production for “Harvey Greenfield is Running Late”

Next week filming commences for Harvey Greenfield is Running Late, a comedy feature based on the critically acclaimed one-man play by Paul Richards. Paul reprises the title role in the film, directed by Jonnie Howard, who I previously worked with on A Cliché for the End of the World and The Knowledge.

The production is based locally to me in Cambridgeshire, and over the last couple of months I’ve attended recces, rehearsals and meetings. I’ve tried to approach it the same way I did Hamlet, reading each draft of the script carefully and creating a spreadsheet breakdown. Scene by scene, the breakdown lists my ideas for camerawork and lighting.

Harvey is a stressed and neurotic character who can’t say no to anything. The film takes place over a single day of his life when he finds himself having to attend a wedding, a funeral, a big meeting at his office, a school play and an appointment at a garage. Numerous scenes see him jogging from commitment to commitment (always running late in more ways than one) while taking phone calls that only add to the pressure. In the finest tradition of Alfie, Ferris Bueller and Fleabag, he also talks to camera.

Talking of finest traditions, the budget is very low but ambitions are high! With 100 script pages and 14 days the shoot will be more of a sprint than a marathon.

The UK film and TV industry is busier at present than I’ve ever known it, making up for lost time last year, so sourcing crew and kit has certainly been challenging. But thanks to generous sponsorship by Global Distribution and Sigma we will be shooting on a Red Ranger Gemini – which regular readers may recall I almost selected for Hamlet – with Sigma Cine primes and zooms. I will be working with a completely new camera team and gaffer.

One of the first things Jonnie told me was that he wanted to use a lot of wide lenses. This makes a lot of sense for the story. Wide lenses fill the background with more clutter, making the frame busier and more stressful for Harvey. They also put us into Harvey’s headspace by forcing the camera physically close to get a tighter shot. We shot some tests early on with Paul, primarily on the Sigma Cine 14mm, to start getting a feel for that look.

Influences include Woody Allen, the Coen brothers, Wes Anderson, Terry Gilliam and Napoleon Dynamite, and as usual, watching reference films has formed an important part of prep for me.

Based on the colour palette Nicole Stone has put together for her costumes, I’ve decided to use orange as Harvey’s stress colour and green when he’s calmer. For most of the film this will just be a case of framing in orange or green elements when appropriate, or putting a splash of the relevant colour in the background. For key scenes later in the story we may go so far as to bathe Harvey in the colour.

Right, I’d better get back to trying to sort out the lighting kit hire, which is still up in the air. Possibly this post should have been called Pre-production for “Harvey Greenfield” is running late.


Using Depth of Field Creatively

“The Handmaid’s Tale: Offred” (2017, DP: Colin Watkinson, ASC, BSC)

When DSLR video exploded onto the indie filmmaking scene a decade ago, film festivals were soon awash with shorts with ultra-blurry backgrounds. Now that we have some distance from that first novelty of large-sensor cinematography we can think more intelligently about how depth of field – be it shallow or deep – is best used to help tell our stories.

First, let’s recap the basics. Depth of field is the distance between the nearest and farthest points from camera that are in focus. The smaller the depth of field, the less the subject has to move before they go out of focus, and the blurrier any background and foreground objects appear. On the other hand, a very large depth of field may make everything from the foreground to infinity acceptably sharp.

Depth of field varying with aperture
Everyone’s favourite time machine at f/5 (left) and f/1.8 (right)

Depth of field is affected by four things: sensor (or film) size, focal length (i.e. lens length), focal distance, and aperture. In the days of tiny Mini-DV sensors, I was often asked by a director to zoom in (increase the focal length) to decrease the depth of field, but sometimes that was counter-productive because it meant moving the camera physically further away, thus increasing the focal distance, thus increasing the depth of field.

It was the large 35mm sensors of DSLRs, compared with the smaller 1/3” or 2/3” chips of traditional video cameras, that made them so popular with filmmakers. Suddenly the shallow depth of field seen in a Super-35 movie could be achieved on a micro-budget. It is worth noting for the purists, however, that a larger sensor technically makes for a deeper depth of field. The shallower depth of field associated with larger sensors is actually a product of the longer lenses required to obtain the same field of view.
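That last point can be illustrated numerically: matching the field of view of a larger sensor forces a proportionally shorter lens on the smaller chip, and it's that shorter lens which deepens the depth of field. A quick sketch (the sensor widths here are approximate and purely for illustration):

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_mm: float) -> float:
    """Horizontal angle of view for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# Approximate widths: ~24.9mm for Super-35 vs ~4.8mm for a 1/3" chip.
# Scaling the focal length by the sensor-width ratio matches the view...
print(round(horizontal_fov_deg(24.9, 35.0), 1))              # ≈ 39.2
print(round(horizontal_fov_deg(4.8, 35.0 * 4.8 / 24.9), 1))  # same ≈ 39.2
# ...and that much shorter lens (~6.7mm) is what gives the small
# chip its characteristically deep depth of field at the same stop.
```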

Once a camera is selected and filming is underway, aperture is the main tool that DPs tend to use to control depth of field. A small aperture (large f- or T-number) gives a large depth of field; a large aperture (small f- or T-number) gives a narrow depth of field. What all those early DSLR filmmakers, high on bokeh, failed to notice is that aperture is, and always has been, a creative choice. Plenty of directors and DPs throughout the history of cinema have chosen deep focus when they felt it was the best way of telling their particular story.
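To put some rough numbers on how stopping down widens the in-focus zone, here are the standard thin-lens depth-of-field formulas in Python (the 0.025mm circle of confusion is an assumed, commonly quoted Super-35 value):

```python
import math

def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.025):
    """Near and far limits of acceptable focus (thin-lens approximation).
    coc_mm is the circle of confusion; ~0.025mm is often used for Super-35."""
    hyperfocal = focal_mm**2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
    if subject_mm >= hyperfocal:
        return near, math.inf  # everything beyond `near` is acceptably sharp
    far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
    return near, far

# A 50mm lens focused at 3m: stopping down from f/2 to f/8 stretches
# the in-focus zone from roughly 0.36m to roughly 1.5m.
for stop in (2, 8):
    near, far = depth_of_field(50, stop, 3000)
    print(f"f/{stop}: {round(far - near)}mm")
```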

“Citizen Kane” (1941, DP: Gregg Toland, ASC)

One of the most famous deep-focus films is 1941’s Citizen Kane, frequently voted the greatest movie ever made. First-time director Orson Welles came from a theatre background, and instructed DP Gregg Toland to keep everything in focus so that the audience could choose what to look at just as they could in a theatre. “What if they don’t look at what they’re supposed to look at?” Welles was apparently asked. “If that happens, I would be a very bad director,” was his reply.

Stanley Kubrick was also fond of crisp backgrounds. The infamous f/0.7 NASA lenses used for the candlelight scenes in Barry Lyndon were a rare and extreme exception born of low-light necessity. A typical Kubrick shot has a formal, symmetrical composition with a single-point perspective and everything in focus right into the distance. Take the barracks in Full Metal Jacket, for example, where the background soldiers are just as sharp as the foreground ones. Like Welles’s, Kubrick’s reasons may have lain in a desire to emulate traditional art-forms, in this case paintings, where nothing is ever blurry.

“Full Metal Jacket” (1987, DP: Douglas Milsome, ASC, BSC)

The Indiana Jones trilogy was shot at a surprisingly slow stop by the late, great Douglas Slocombe. “I prefer to work in the aperture range of T14-T14.5 when I am shooting an anamorphic film like Raiders,” he said at the time. “The feeling of depth contributed to the look.” Janusz Kamiński continued that deep-focus look, shooting at T8-T11 when he inherited the franchise for Kingdom of the Crystal Skull.

At the other end of the aperture scale, the current Hulu series The Handmaid’s Tale makes great creative use of a shallow depth of field, creating a private world for the oppressed protagonist which works in tandem with voiceovers to put the viewer inside her head, the only place where she is free.

A director called James Reynolds had a similar idea in mind when I shot his short film, Exile Incessant. He wanted to photograph closed-minded characters with shallow focus, and show the more tolerant characters in deep focus, symbolising their openness and connection with the world. (Unfortunately the tiny lighting budget made deep focus impossible, so we instead achieved the symbolism by varying the harshness of the lighting.)

“Ren: The Girl with the Mark” (2016, DP: Neil Oseman)

One production where I did vary the depth of field was Ren: The Girl with the Mark, where I chose f/4 as my standard working stop, but reduced it to as little as f/1.4 when the lead character was bonding with the mysterious spirit inside her. It was the same principle again of separating the subject from the world around her.

Depth of field is a fantastic creative tool, and one which we are lucky to have so much control over with today’s cameras. But it will always be most effective when it’s used expressively, not just aesthetically.


6 Things to Beware of with Vintage Lenses

Ever since digital cinematography became the norm, DPs have sought to counter the format’s perfection with characterful vintage lenses. Having just completed a feature film shoot, Hamlet, on Cooke Panchros and a Cooke 10:1 Varotal, I’m over the moon with the beautiful, creamy, organic look they brought to the production. However, I can’t deny that they have some disadvantages over modern glass which you should take into consideration before choosing the vintage approach.

 

1. Softness

Vintage lenses simply aren’t as sharp as their modern counterparts, particularly at the edges of frame and particularly when the iris is wide open. On Hamlet I deliberately shot with the Panchros wide open to soften the image, rather than adding a diffusion filter like I’ve often done in the past, but that look is not for everyone, and it does make things a little harder for your focus puller. Be sure to test the sharpness and view the results on a large screen before committing.

 

2. Breathing

Breathing is the phenomenon whereby a lens appears to zoom slightly in or out when the focus is pulled. The Cooke Varotal is especially prone to this, so my focus puller Aristide Russo had to be very gentle with his pulls, otherwise the breathing became distracting.

 

3. Veiling

Many DPs love lens flares, and beautiful, natural flares were one of the reasons I picked the vintage Cooke glass. But look out for veiling flare – a milkiness and lift in the shadows affecting the whole frame. I noticed this a lot when shooting under the practical fluorescents in Hamlet‘s stage set, especially with handheld shots where the veiling would appear and disappear depending on the camera’s angle to the lights. I decided to embrace it and make it part of the film’s look, but if maintaining high contrast at all times is important to you, lenses without modern coatings may not be the right choice.

 

4. Vignetting

Check for dark patches in the corners of your image. The Varotal I used vignetted at certain parts of the zoom range and not at others, so the dark corners would appear and disappear during a zoom. Although not ideal, it isn’t noticeable most of the time. Besides, I figured that most colourists add vignettes to most shots anyway, so I was simply saving them a little time!

 

5. Mechanics

Older lenses are, quite naturally, less reliable. Even if they have been rehoused, like our Cooke “Century” Panchros had been in 2000, you may find that the iris and/or focus sticks sometimes. Our 25mm started to play up halfway through our shoot, forcing Aris to use the rosettes to support the matte box, otherwise the motor wasn’t powerful enough to turn the focus ring. This possibility was flagged for me during testing when we had a similar issue with the 50mm. Even if all your lenses seem to be fine during prep, know that a vintage lens could start misbehaving at any time, and your rental house may not have another on the shelf to replace it with.

 

6. Uniformity

Don’t expect a set of vintage primes to all have the same maximum aperture or the same external configuration. The iris ring might be buried in the matte box, the matte box might not fit on at all, or it may be impossible to engage both iris and focus motors at the same time.

 

All this sounds quite negative, but the flares, softness, breathing and vignettes can be absolutely beautiful. Be aware of the downsides of using vintage glass, absolutely, but if they suit your story then embrace the flaws and get ready to be blown away by your dailies.

In case you missed them the first time, I’ll leave you with some highlights from my Hamlet lens tests.


Undisclosed Project: Experimentation

The main event of last week’s prep was a test at Panavision of the Arri Alexa XT, Red Gemini and Sony F55, along with Cooke Panchro, Cooke Varotal, Zeiss Superspeed and Angenieux glass. More on that below, along with footage.

The week started with Zoom meetings with the costume designer, the make-up artist, potential fight choreographers and a theatrical lighting designer. The latter is handling a number of scenes which take place on a stage, which is a new and exciting collaboration for me. I met with her at the location the next day, along with the gaffer and best boy. After discussing the stage scenes and what extra sources we might need – even as some of them were starting to be rigged – I left the lighting designer to it. The rest of us then toured the various rooms of the location, with the best boy making notes and lighting plans on his tablet as the gaffer and I discussed them. They also took measurements and worked out what distro they would need, delivering a lighting kit list to production the next day.

Meanwhile, at the request of the producer, I began a shot list, beginning with two logistically complex scenes. Despite all the recces so far, I’ve not thought about shots as much as you might think, except where they are specified in the script or where they jumped out at me when viewing the location. I expect that much of the shot planning will be done during the rehearsals, using Artemis Pro. That’s much better and easier than sitting at home trying to imagine things, but it’s useful for other departments to be able to see a shot list as early as possible.

So, the camera tests. I knew all along that I wanted to test multiple cameras and lenses to find the right ones for this project, a practice that is common on features but which, for one reason and another, I’ve never had a proper chance to do before. So I was very excited to spend Wednesday at Panavision, not far from my old stomping ground in Perivale, playing around with expensive equipment.

Specifically we had: an Arri Alexa – a camera I’m very familiar with, and my gut instinct for this project; a Sony F55 – which I was curious to test because it was used to shoot the beautiful Outlander series; and a Red Gemini – because I haven’t used a Red in years and I wanted to check I wasn’t missing out on something awesome.

For lenses we had: a set of Cooke Panchros – again a gut instinct (I’ve never used them, but from what I’ve read they seemed to fit); a set of Zeiss Superspeeds – selected after reviewing my 2017 test footage from Arri Rental; a couple of Cooke Varotal zooms, and the equivalents by the ever-reliable Angenieux. Other than the Angenieux we used on the B-camera for The Little Mermaid (which I don’t think we ever zoomed during a take), I’ve not used cinema zooms before, but I want the old-fashioned look for this project.

Here are the edited highlights from the tests…

You’ll notice that the Sony F55 disappears from the video quite early on. This is because, although I quite liked the camera on the day, as soon as I looked at the images side by side I could see that the Sony was significantly softer than the other two.

So it was down to the Alexa vs. the Gemini, and the Cookes vs. the Superspeeds. I spent most of Thursday and all of Friday morning playing with the footage in DaVinci Resolve, trying to decide between these two pairs of very close contenders. I tried various LUTs, did some rough grading (very badly, because I’m not a colourist), tested how far I could brighten the footage before it broke down, and examined flares and bokeh obsessively.

Ultimately I chose the Cooke Panchros because (a) they have a beautiful and very natural-looking flare pattern, (b) the bokeh has a slight glow to it which I like, (c) the bokeh remains a nice shape when stopped down, unlike the Superspeeds’, which goes a bit geometric, (d) they seem sharper than the Superspeeds at the edges of frame when wide open, and (e) more lengths are available.

As for the zoom lenses (not included in the video), the Cooke and the Angenieux were very similar indeed. I chose the former because it focuses a little closer and the bokeh again has that nice glow.

I came very close to picking the Gemini as my camera. I think you’d have to say, objectively, it produces a better image than the Alexa, heretical as that may sound. The colours seem more realistic (although we didn’t shoot a colour chart, which was a major oversight) and it grades extremely well. But…

I’m not making a documentary. I want a cinematic look, and while the Gemini is by no means un-cinematic, the Alexa was clearly engineered by people who loved the look of film and strove to recreate it. When comparing the footage with the The Godfather and Fanny and Alexander screen-grabs that are the touchstone of the look I want to create, the Alexa was just a little bit closer. My familiarity and comfort level with the Alexa was a factor too, and the ACs felt the same way.

I’m very glad to have tested the Gemini though, and next time I’m called upon to shoot something great and deliver in 4K (not a requirement on this project) I will know exactly where to turn. A couple of interesting things I learnt about it are: (1) whichever resolution (and concomitant crop factor) you select, you can record a down-scaled 2K ProRes file, and this goes for the Helium too; (2) 4K gives the Super-35 field of view, whereas 5K shows more, resulting in some lenses vignetting at this resolution.


Exposure Part 2: Neutral Density (ND) Filters

In the first part of this series, I explained the concepts of f-stops and T-stops, and looked at how aperture can be used to control exposure. We saw that changing the aperture causes side effects, most noticeably altering the depth of field.

How can we set the correct exposure without compromising our depth of field? Well, as we’ll see later in this series, we can adjust the shutter angle and/or ISO, but both of those have their own side effects. More commonly a DP will use neutral density (ND) filters to control the amount of light reaching the lens. These filters get their name from the fact that they block all wavelengths of light equally, so they darken the image without affecting the colour.

 

When to use an ND Filter

Let’s look at an example. Imagine that I want to shoot at T4; this aperture gives a nice depth of field, on the shallow side but not excessively so. My subject is very close to a bright window and my incident light meter is giving me a reading of f/11. (Although I’m aiming for a T-stop rather than an f-stop, I can still use the f-number my meter gives me; in fact if my lens were marked in f-stops then my exposure would be slightly off because the meter does not know the transmission efficiency of my lens.) Let’s remind ourselves of the f-stop/T-stop series before we go any further:

1      1.4      2      2.8      4      5.6      8      11      16      22     32

By looking at this series, which can be found printed on any lens barrel or permanently displayed on a light meter’s screen, I can see that f/11 (or T11) is three stops down from f/4 (or T4) – because 11 is three numbers to the right of 4 in the series. To achieve correct exposure at T4 I’ll need to cut three stops of light. I can often be seen on set counting the stops like this on my light meter or on my fingers. It is of course possible to work it out mathematically or with an app, but that’s not usually necessary. You quickly memorise the series of stops with practice.
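If you’d rather check the finger-counting with a script, the maths behind it is simple: the light admitted varies with the square of the f-number ratio, so the stop difference is twice the base-2 logarithm of that ratio. Here’s a minimal Python sketch – purely an illustration of the arithmetic, not anything I actually use on set:

```python
import math

def stops_between(wide, narrow):
    """Stops of difference between two f-numbers (e.g. f/4 and f/11).

    Light admitted varies with aperture area, i.e. with the square of the
    f-number ratio, so the difference is 2 * log2(narrow / wide).
    """
    return 2 * math.log2(narrow / wide)

print(stops_between(4, 11))     # ≈ 2.92 – three stops, once you allow for
                                # the rounded numbers marked on the barrel
print(stops_between(2.8, 5.6))  # exactly 2.0
```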

 

What Strength of filter to choose

Some ND filters are marked in stops, so I could simply select a 3-stop ND and slide it into my matte box or screw it onto my lens. Others – the built-in ND filters on the Sony FS7, for example – are defined by the fraction of light they let through. So the FS7’s 1/4 ND cuts two stops; the first stop halves the light – as we saw in part one of this series – and the second stop halves it again, leaving us a quarter of the original amount. The 1/16 setting cuts four stops.
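The fraction-to-stops conversion is just another base-2 logarithm, if you ever want to check one of these fractional labels. A throwaway Python sketch:

```python
import math

def fraction_to_stops(fraction):
    """Stops of light cut by an ND quoted as a transmitted fraction."""
    return math.log2(1 / fraction)

print(fraction_to_stops(1/4))   # 2.0 – a 1/4 ND cuts two stops
print(fraction_to_stops(1/16))  # 4.0
print(fraction_to_stops(1/64))  # 6.0
```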

However, most commonly, ND filters are labelled in optical density. Tiffen’s NDs are a popular choice amongst professional cinematographers, and a typical set might be labelled as follows:

.3      .6      .9      1.2

That’s the optical density, a property defined as the base-10 logarithm of the ratio of the quantity of light entering the filter to the quantity of light exiting it on the other side. A .3 ND reduces the light by half because 10 raised to the power of -0.3 is about 0.5, and reducing light by half, as we’ve previously established, means dropping one stop.

If that maths is a bit much for you, don’t worry. All you really need to do is multiply the number of stops you want to cut by 0.3 to find the filter you need. So, going back to my example with the bright window, to get from T11 to T4, i.e. to cut three stops, I’ll pick the .9 ND.

It’s far from intuitive at first, but once you get your head around it, and memorise the f-stops, it’s not too difficult. Trust me!

Here are a couple more examples:

  • Light meter reads f/8 and you want to shoot at T5.6. That’s a one stop difference. (5.6 and 8 are right next to each other in the stop series, as you’ll see if you scroll back to the top.) 1 x 0.3 = 0.3 so you should use the .3 ND.
  • Light meter reads f/22 and you want to shoot at T2.8. That’s a six stop difference (scroll back up and count them), and 6 x 0.3 = 1.8, so you need a 1.8 ND filter. If you don’t have one, you need to stack two NDs in your matte box that add up to 1.8, e.g. a 1.2 and a .6.
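The 0.3-per-stop rule works because 0.3 is (near enough) the base-10 logarithm of 2. Here’s a small Python sketch of the conversions used in the examples above, just for illustration:

```python
import math

LOG10_2 = math.log10(2)  # ≈ 0.301 – the "0.3 per stop" rule of thumb

def density_for_stops(stops):
    """Optical density of the ND needed to cut a given number of stops."""
    return stops * LOG10_2

def stops_for_density(density):
    """Stops of light cut by a filter of a given optical density."""
    return density / LOG10_2

print(round(density_for_stops(3), 1))          # 0.9 – the .9 ND from the window example
print(round(stops_for_density(1.2 + 0.6), 1))  # 6.0 – a stacked 1.2 + .6 cuts six stops
```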

 

Variations on a Theme

Variable ND filters are also available. These consist of two polarising filters which can be rotated against each other to progressively lighten or darken the image. They’re great for shooting guerilla-style with a small crew. You can set your iris where you want it for depth of field, then expose the image by eye simply by turning the filter. On the down side, they’re hard to use with a light meter because there is often little correspondence between the markings on the filter and stops. They can also have a subtle adverse effect on skin tones, draining a person’s apparent vitality, as some of the light which reflects off human skin is polarised.

IR pollution increases with successively stronger ND filters (left to right) used on a Blackmagic Micro Cinema Camera. The blue dyes in this costume evidently reflect a large amount of IR.

Another issue to look out for with ND filters is infra-red (IR). Some filters cut only the visible wavelengths of light, allowing IR to pass through. Some digital sensors will interpret this IR as visible red, resulting in an image with a red colour cast which can be hard to grade out because different materials will be affected to different degrees. Special IR ND filters are available to eliminate this problem.

These caveats aside, ND filters are the best way to adjust exposure (downwards at least) without affecting the image in any other way.

In the next part of this series I’ll look at shutter angles, what they mean, how they affect exposure and what the side effects are.

Learn how to use ND filters practically with my Cinematic Lighting online course. Enter voucher code INSTA90 for an amazing 90% off.


Exposure Part 1: Aperture

This is the first in a series of posts where I will look in detail at the four means of controlling the brightness of a digital video image: aperture, neutral density (ND) filters, shutter angle and ISO. It is not uncommon for newer cinematographers to have only a partial understanding of these topics, enough to get by in most situations; that was certainly the case with me for many years. The aim of this series is to give you an understanding of the underlying mechanics which will enable you to make more informed creative decisions.

You can change any one of the four factors, or any combination of them, to reach your desired level of exposure. However, most of them will also affect the image in other ways; for example, aperture affects depth of field. One of the key responsibilities of the director of photography is to use each of the four factors not just to create the ideal exposure, but to make appropriate use of these “side effects” as well.

 

f-stops and t-stops

The most common way of altering exposure is to adjust the aperture, a.k.a. the iris, sometimes described as changing “the stop”. Just like the pupil in our eyes, the aperture of a photographic lens is a (roughly) circular opening which can be expanded or contracted to permit more or less light through to the sensor.

You will have seen a series of numbers like this printed on the sides of lenses:

1      1.4      2      2.8      4      5.6      8      11      16      22     32

These are ratios – ratios of the lens’ focal length to its iris diameter. So a 50mm lens with a 25mm diameter iris is at f/2. Other lengths of lens would have different iris diameters at f/2 (e.g. 10mm diameter for a 20mm lens) but they would all produce an image of the same brightness. That’s why we use f-stops to talk about iris rather than diameters.

But why not label a lens 1, 2, 3, 4…? Why 1, 1.4, 2, 2.8…? These magic numbers are f-stops. A lens set to f/1.4 will let in twice as much light as (or “one stop more than”) a lens set to f/2, which in turn will let in twice as much as one set to f/2.8, and so on. Conversely, a lens set to f/2.8 will let in half as much light as (or “one stop less than”) a lens set to f/2, and so on. (Note that a number between any of these f-stops, e.g. f/1.8, is properly called an f-number, but not an f-stop.) These doublings or halvings – technically known as a base-2 logarithmic scale – are a fundamental concept in exposure, and mimic our eyes’ response to light.

If you think back to high-school maths and the πr² formula for calculating the area of a circle from its radius, the reason for the seemingly random series of numbers will start to become clear. Letting in twice as much light requires twice as much area for those light rays to fall on, and remember that the f-number is the ratio of the focal length to the iris diameter, so you can see how square roots are going to get involved and why f-stops aren’t just plain old round numbers.
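In other words, each full stop multiplies the f-number by √2 (about 1.4), because doubling a circle’s area multiplies its diameter by √2. You can generate the whole series that way – the barrel markings are just these values rounded off. A quick Python sketch:

```python
import math

# Each full stop multiplies the f-number by sqrt(2), because halving the
# light means halving the area of the (roughly circular) aperture.
series = [round(math.sqrt(2) ** n, 1) for n in range(11)]
print(series)
# [1.0, 1.4, 2.0, 2.8, 4.0, 5.7, 8.0, 11.3, 16.0, 22.6, 32.0]
# The marked stops 5.6, 11 and 22 are just friendlier roundings of these.
```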

If you’re shooting with a cine lens, rather than a stills lens, you’ll see the same series of numbers on the barrel, but here they are T-stops rather than f-stops. T-stops are f-stops adjusted to compensate for the light transmission efficiency. Two different lenses set to, say, f/2 will not necessarily produce equally bright images, because some percentage of light travelling through the elements will always be lost, and that percentage will vary depending on the quality of the glass and the number of elements. A lens with 100% light transmission would have the same f-number and T-number, but in practice the T-number will always be a little bigger than the f-number. For example, Cooke’s 15-40mm zoom is rated at a maximum aperture of T2 or f/1.84.
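The relationship is easy to sketch: image brightness goes as transmittance divided by the square of the f-number, so the equivalent T-stop is the f-number divided by the square root of the transmittance. In the Cooke example, a transmittance of around 85% – my assumption for illustration, as I don’t have Cooke’s exact figure – squares with f/1.84 becoming T2:

```python
import math

def t_stop(f_number, transmittance):
    """T-stop equivalent to an f-number for a lens with the given overall
    light transmittance (1.0 = lossless glass)."""
    return f_number / math.sqrt(transmittance)

print(round(t_stop(1.84, 0.85), 2))  # ≈ 2.0 – close to Cooke's T2 rating
print(t_stop(2.0, 1.0))              # 2.0 – a perfect lens's T- and f-numbers match
```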

 

Fast and slow lenses

When buying or renting a lens, one of the first things you will want to know is its maximum aperture. Lenses are often described as being fast (larger maximum aperture, denoted by a smaller f- or T-number like T1.4) or slow (smaller maximum aperture, denoted by a bigger f- or T-number like T4). These terms come from the fact that the shutter speed would need to be faster or slower to capture the same amount of light… but more on that later in the series.

Faster lenses are generally more expensive, but that expense may well be outweighed by the savings made on lighting equipment. Let’s take a simple example, and imagine an interview lit by a 4-bank Kino Flo and exposed at T2.8. If our lens can open one stop wider (known as stopping up) to T2 then we double the amount of light reaching the sensor. We can therefore halve the level of light – by turning off two of the Kino Flo’s tubes or by renting a cheaper 2-bank unit in the first place. If we can stop up further, to T1.4, then we only need one Kino tube to achieve the same exposure.
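The saving scales with the square of the T-stop ratio, which is easy to sanity-check. A quick Python sketch of the Kino Flo scenario above (the 4-tube baseline is just that example, not a general rule):

```python
def relative_light_needed(t, reference_t=2.8):
    """Light level needed for the same exposure at stop t, relative to the
    level needed at the reference stop (here T2.8, our 4-tube baseline)."""
    return (t / reference_t) ** 2

print(relative_light_needed(2.0))  # ≈ 0.51 – about half the light, so 2 of the 4 tubes
print(relative_light_needed(1.4))  # 0.25 – a quarter, so a single tube
```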

 

Side effects

One of the first things that budding cinematographers learn is that wider apertures make for a smaller depth of field, i.e. the range of distances within which a subject will be in focus is smaller. In simple terms, the background of the image is blurrier when the depth of field is shallower.

It is often tempting to go for the shallowest possible depth of field, because it feels more cinematic and helps conceal shortcomings in the production design, but that is not the right look for every story. A DP will often choose a stop to shoot at based on the depth of field they desire. That choice of stop may affect the entire lighting budget; if you want to shoot at a very slow T14 like Douglas Slocombe did for the Indiana Jones trilogy, you’re going to need several trucks full of lights!

There is another side effect of adjusting the aperture which is less obvious. Lenses are manufactured to perform best in the middle of their iris range. If you open a lens up to its maximum aperture or close it down to its minimum, the image will soften a little. Therefore another advantage of faster lenses is the ability to get further away from their maximum aperture (and poorest image quality) with the same amount of light.

Finally it is worth noting that the appearance of bokeh (out of focus areas) and lens flares also changes with aperture. The Cooke S4 range, for example, renders out-of-focus highlights as circles when wide open, but as octagons when stopped down. With all lenses, the star pattern seen around bright light sources will be stronger when the aperture is smaller. You should shoot tests – like these I conducted in 2017 – if these image artefacts are a critical part of your film’s look.

Next time we’ll look at how we can use ND filters to control exposure without compromising our choice of stop.

Learn how to use exposure practically with my Cinematic Lighting online course. Enter voucher code INSTA90 for an amazing 90% off.


How is Dynamic Range Measured?

The high dynamic range of the ARRI Alexa Mini allowed me to retain all the sky detail in this shot from “Above the Clouds”.

Recently I’ve been pondering which camera to shoot an upcoming project on, so I consulted the ASC’s comparison chart. Amongst the many specs compared is dynamic range, and I noticed that the ARRI Alexa’s was given as 14+ stops, while the Blackmagic URSA’s is 15. Having used both cameras a fair bit, I can tell you that there’s no way in Hell that the URSA has a higher dynamic range than the Alexa. So what’s going on here?

 

What is dynamic range?

To put it simply, dynamic range is the level of contrast that an imaging system can handle. To quote Alan Roberts, who we’ll come back to later:

This is normally calculated as the ratio of the exposure which just causes white clipping to the exposure level below which no details can be seen.

A photosite on a digital camera’s sensor outputs a voltage proportional to the amount of light hitting it, but at some point the voltage reaches a maximum, and no matter how much more light you add, it won’t change. At the other end of the scale, a photosite may receive so little light that it outputs no voltage, or at least nothing that’s discernible from the inherent electronic noise in the system. These upper and lower limits of brightness may be narrowed by image processing within the camera, with RAW recording usually retaining the full dynamic range, while linear Rec. 709 severely curtails it.

In photography and cinematography, we measure dynamic range in stops – doublings and halvings of light which I explain fully in this article. One stop is a ratio of 2:1, five stops are 32:1, and thirteen stops are almost 10,000:1.
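Those ratios are simply powers of two, which is trivial to verify with a one-liner sketch:

```python
def contrast_ratio(stops):
    """Contrast ratio corresponding to a dynamic range given in stops."""
    return 2 ** stops

print(contrast_ratio(1))   # 2 – i.e. 2:1
print(contrast_ratio(5))   # 32 – i.e. 32:1
print(contrast_ratio(13))  # 8192 – i.e. almost 10,000:1
```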

It’s worth pausing here to point out the difference between dynamic range and latitude, a term which is sometimes treated as synonymous but is in fact distinct. Latitude is a measure of how much the camera can be over- or under-exposed without losing any detail, and it depends on both the dynamic range of the camera and the dynamic range of the scene. (A low-contrast scene will allow more latitude for incorrect exposure than a high-contrast scene.)

 

Problems of Measurement

Before digital cinema cameras were developed, video had a dynamic range of about seven stops. You could measure this relatively easily by shooting a greyscale chart and observing the waveform of the recorded image to see where the highlights levelled off and the shadows disappeared into the noise floor. With today’s dynamic ranges into double digits, simple charts are no longer practical, because you can’t manufacture white enough paper or black enough ink.

For his excellent video on dynamic range, Filmmaker IQ’s John Hess built a device fitted with a row of 1W LEDs, using layers of neutral density gel to make each one a stop darker than its neighbour. For the purposes of his demonstration, this works fine, but as Phil Rhodes points out on RedShark News, you start running into the issue of the dynamic range of the lens.

It may seem strange to think that a lens has dynamic range, and in the past when I’ve heard other DPs talk about certain glass being more or less contrasty, I admit that I haven’t thought much about what that means. What it means is flare – not the good anamorphic streak kind, but the general veiling whereby a strong light shining into the lens raises the overall brightness of the image as it bounces around the different elements. This lifts the shadows, producing a certain amount of milkiness. Even with high-contrast lenses, ones which are less prone to veiling, the brightest light on your test device will cast some glare over the darkest one when you are measuring the kind of dynamic range today’s cameras enjoy.

 

Manufacturer Measurements

Going back to my original query about the Alexa versus the URSA, let’s see exactly what the manufacturers say. ARRI specifically states that its sensor’s dynamic range is over 14 stops “as measured with the ARRI Dynamic Range Test Chart”. So what is this chart and how does it work? The official sales blurb runs thusly:

The ARRI DRTC-1 is a special test chart and analysis software for measurement of dynamic range and sensitivity of digital cameras. Through a unique stray light reduction concept this system is able to accurately measure up to 15.5 stops of dynamic range.

The “stray light reduction” is presumably to reduce the veiling mentioned earlier and provide more accurate results. This could be as simple as covering or turning off the brighter lights when measuring the dimmer ones.

I found a bit more information about the test chart in a 2011 camera shoot-out video, from that momentous time when digital was supplanting film as the cinematic acquisition format of choice. Rather than John Hess’s ND gel technique, the DRTC-1 opts for something else to regulate its light output, as ARRI’s Michael Bravin explains in the video:

There’s a piece of motion picture film behind it that’s checked with a densitometer, and what you do is you set the exposure for your camera, and where you lose detail in the vertical and horizontal lines is your clipping point, and where you lose detail because of noise in the shadow areas is your lowest exposure… and in between you end up finding the number of stops of dynamic range.

Blackmagic Design do not state how they measure the dynamic range of their cameras, but it may be a DSC Labs Xyla. This illuminated chart boasts a shutter system which “allows users to isolate and evaluate individual steps”, plus a “stepped xylophone shape” to minimise flare problems.

Art Adams, a cinema lens specialist at ARRI, and someone who’s frequently quoted in Blain Brown’s Cinematography: Theory & Practice, told Y.M. Cinema Magazine:

I used to do a lot of consulting with DSC Labs, who make camera test charts, so I own a 20-stop dynamic range chart (DSC Labs Xyla). This is what most manufacturers use to test dynamic range (although not ARRI, because our engineers don’t feel it’s precise enough) and I see what companies claim as usable stops. You can see that they are just barely above the noise floor.

 

Conclusions

Obviously these ARRI folks I keep quoting may be biased. I wanted to find an independent test that measures both Blackmagics and Alexas with the same conditions and methodology, but I couldn’t find one. There is plenty of anecdotal evidence that Alexas have a bigger dynamic range, in fact that’s widely accepted as fact, but quantifying the difference is harder. The most solid thing I could find is this, from a 2017 article about the Blackmagic Ursa Mini 4.6K (first generation):

The camera was measured at just over 14 stops of dynamic range in RAW 4:1 [and 13 stops in ProRes]. This is a good result, especially considering the price of the camera. To put this into perspective Alan measured the Canon C300 mkII at 15 stops of dynamic range. Both the URSA Mini 4.6 and C300 mkII are bettered by the ARRI Alexa and Amira, but then that comes as no surprise given their reputation and price.

The Alan mentioned is Alan Roberts, something of a legend when it comes to testing cameras. It is interesting to note that he is one of the key players behind the TLCI (Television Lighting Consistency Index), a mooted replacement for CRI (Colour Rendering Index). It’s interesting because this whole dynamic range business is starting to remind me of my investigation into CRI, and is leading me to a similar conclusion, that the numbers which the manufacturers give you are all but useless in real-world cinematography.

Whereas CRI at least has a standardised test, there’s no such thing for dynamic range. Therefore, until there is more transparency from manufacturers about how they measure it, I’d recommend ignoring their published values. As always when choosing a camera, shoot your own tests if at all possible. Even the most reliable numbers can’t tell you whether you’re going to like a camera’s look or not, or whether it’s right for the story you want to tell.

When tests aren’t possible, and I know that’s often the case in low-budget land, at least try to find an independent comparison. I’ll leave you with this video from the Slanted Lens, which compares the URSA Mini Pro G2 with the ARRI Amira (which uses the same Alev III sensor as the Alexa). They don’t measure the dynamic range, but you can at least see the images side by side, and in the end it’s the images that matter, not the numbers.
