Astera Titan Tubes seem to be everywhere at the moment, every gaffer and DP’s favourite tool. Resembling fluorescent tubes, Asteras are wireless, flicker-free LED batons comprising 16 pixels which can be individually coloured, flashed and programmed from an app to produce a range of effects.
Here are five ways in which I used Titan Tubes on my most recent feature, Hamlet. I’m not being sponsored by Astera to write this. I just know that loads of people out there are using them and I thought it would be interesting to share my own experiences.
1. Substitute fluorescents
We had a lot of scenes with pre-existing practical fluorescents in them. Sometimes we gelled these with ND or a colour to get the look we wanted, but other times it was easier to remove the fluorescent tube and cable-tie an Astera into the housing. As long as the camera didn’t get too close you were never going to see the ties, and the light could now be altered with the tap of an app.
On other occasions, when we moved in for close-ups, the real fluorescents weren’t in an ideal position, so we would supplement or replace them with an Astera on a stand and match the colour.
2. Hidden behind corners
Orientated vertically, Asteras are easy to hide behind pillars and doorways. One of the rooms we shot in had quite a dark doorway into a narrow corridor. There was just enough space to put in a vertical pole-cat with a tube on it which would light up characters standing in the doorway without it being seen by the camera.
3. Eye light
Ben Millar, Hamlet‘s gaffer, frequently laid an Astera on the floor to simulate a bit of floor bounce and put a sparkle in the talent’s eye. On other occasions, when our key light was coming in at a very sidey angle, we would put an Astera in a more frontal position to ping the eyes again and to wrap the side light very slightly.
4. Rigged to the ceiling
We had a scene in a bathroom that was all white tiles. It looked very flat with the extant overhead light on. Our solution was to put up a couple of pole-cats, at the tops of the two walls that the camera would be facing most, and hang Asteras horizontally from them. Being tubes they have a low profile so it wasn’t hard to keep them out of the top of frame. We put honeycombs on them and the result was that we always had soft, wrappy backlight with minimal illumination of the bright white tiles.
5. Special effects
One of the most powerful things about Titan Tubes is that you can programme them with your own special effects. When we needed a Northern Lights effect, best boy Connor Adams researched the phenomenon and programmed a pattern of shifting greens into two tubes rigged above the set.
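Astera’s app and effect engine are proprietary, so purely as an illustrative sketch, here is how a shifting-green aurora pattern across a 16-pixel tube might be generated. The function name and colour values are my own invention, not Astera’s:

```python
import math

def aurora_frame(t, num_pixels=16):
    """One frame of a drifting green wash, as (r, g, b) tuples in 0-255."""
    frame = []
    for i in range(num_pixels):
        # A slow sine wave travels along the tube; green dominates,
        # with a touch of blue for a colder, aurora-like feel.
        phase = math.sin(2 * math.pi * i / num_pixels - 0.5 * t)
        level = (phase + 1) / 2  # normalise to the range 0-1
        frame.append((0, int(120 + 100 * level), int(30 + 40 * level)))
    return frame
```

Stepping `t` forward on each frame drifts the pattern along the tube, which is roughly the kind of slow, organic movement a Northern Lights effect needs.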
On War of the Worlds in 2019 we used the Asteras’ emergency lights preset to pick up some close-ups which were meant to have a police car just out of shot.
What colour is moonlight? In cinema, the answer is often blue, but what is the reality? Where does the idea of blue moonlight come from? And how has the colour of cinematic moonlight evolved over the decades?
The science bit
According to universetoday.com the lunar surface “is mostly oxygen, silicon, magnesium, iron, calcium and aluminium”. These elements give the moon its colour: grey, as seen best in photographs from the Apollo missions and images taken from space.
When viewed from Earth, Rayleigh scattering by the atmosphere removes the bluer wavelengths of light. This is most noticeable when the moon is low in the sky, when the large amount of atmosphere that the light has to travel through turns the lunar disc quite red, just as with the sun, while at its zenith the moon merely looks yellow.
Yellow is literally the opposite (or complement) of blue, so where on (or off) Earth did this idea of blue cinematic moonlight come from?
One explanation is that, in low light, our vision comes from our rods, the most numerous type of receptor in the human retina (see my article “How Colour Works” for more on this). These cells are more sensitive to blue than any other colour. This doesn’t mean that things actually look blue in moonlight, just that objects which reflect blue light are more visible than those that don’t.
In reality everything looks monochromatic under moonlight because there is only one type of rod, unlike the three types of cones (red, green and blue) which permit colour vision in brighter situations. I would personally describe moonlight as a fragile, silvery grey.
Blue moonlight on screen dates back to the early days of cinema, before colour cinematography was possible, but when enterprising producers were colour-tinting black-and-white films to get more bums on seats. The Complete Guide to Colour by Tom Fraser has this to say:
As an interesting example of the objectivity of colour, Western films were tinted blue to indicate nighttime, since our eyes detect mostly blue wavelengths in low light, but orange served the same function in films about the Far East, presumably in reference to the warm evening light there.
It’s entirely possible that the choice to tint night scenes blue has as much to do with our perception of blue as a cold colour as it does with the functioning of our rods. This perception in turn may come from the way our skin turns bluer when cold, due to reduced blood flow, and redder when hot. (We saw in my recent article on white balance that, when dealing with incandescence at least, bluer actually means hotter.)
Whatever the reason, by the time it became possible to shoot in colour, blue had lodged in the minds of filmmakers and moviegoers as a shorthand for night.
Early colour films often staged their night scenes during the day; DPs underexposed and fitted blue filters in their matte boxes to create the illusion. It is hard to say whether the blue filters were an honest effort to make the sunlight look like moonlight or simply a way of winking to the audience: “Remember those black-and-white films where blue tinting meant you were watching a night scene? Well, this is the same thing.”
Day-for-night fell out of fashion probably for a number of reasons: 1. audiences grew more savvy and demanded more realism; 2. lighting technology for large night exteriors improved; 3. day-for-night scenes looked extremely unconvincing when brightened up for TV broadcast. Nonetheless, it remains the only practical way to show an expansive seascape or landscape, such as the desert in Mad Max: Fury Road.
One of the big technological changes for night shooting was the availability of HMI lighting, developed by Osram in the late 1960s. With these efficient, daylight-balanced fixtures large areas could be lit with less power, and it was easy to render the light blue without gels by photographing on tungsten film stock.
Cinematic moonlight reached a peak of blueness in the late 1980s and early ’90s, in keeping with the general fashion for saturated neon colours at that time. Filmmakers like Tony Scott, James Cameron and Jan de Bont went heavy on the candy-blue night scenes.
By the start of the 21st century bright blue moonlight was starting to feel a bit cheesy, and DPs were experimenting with other looks.
Speaking of the ferry night scene in War of the Worlds, Janusz Kaminski, ASC, said:
I didn’t use blue for that night lighting. I wanted the night to feel more neutral. The ferryboat was practically illuminated with warm light and I didn’t want to create a big contrast between that light and a blue night look.
The invention of the digital intermediate (DI) process, and later the all-digital cinematography workflow, greatly expanded the possibilities for moonlight. It can now be desaturated to produce something much closer to the silvery grey of reality. Conversely, it can be pushed towards cyan or even green in order to fit an orange-and-teal scheme of colour contrast.
Darius Wolski, ASC made this remark to American Cinematographer in 2007 about HMI moonlight on the Pirates of the Caribbean movies:
The colour temperature difference between the HMIs and the firelight is huge. If this were printed without a DI, the night would be candy blue and the faces would be red. [With a digital intermediate] I can take the blue out and turn it into more of a grey-green, and I can take the red out of the firelight and make it more yellow.
My favourite recent approach to moonlight was in the Amazon sci-fi series Tales from the Loop. Jeff Cronenweth, ASC decided to shoot all the show’s night scenes at blue hour, a decision motivated by the long dusks (up to 75 minutes) in Winnipeg, where the production was based, and the legal limits on how late the child actors could work.
The results are beautiful. Blue moonlight may be a cinematic myth, but Tales from the Loop is one of the few places where you can see real, naturally blue light in a night scene.
If you would like to learn how to light and shoot night scenes, why not take my online course, Cinematic Lighting? 2,300 students have enrolled to date, awarding it an average of 4.5 stars out of 5. Visit Udemy to sign up now.
Next month, Terminator 2: Judgment Day turns 30. Made by a director and star at the peaks of their powers, T2 was the most expensive film ever at the time, and remains both the highest-grossing movie of Arnold Schwarzenegger’s career and the sequel which furthest out-performed its progenitor. It is also one of a handful of films that changed the world of visual effects forever, signalling as it did – to borrow the subtitle from its woeful follow-up – the rise of the machines.
The original Terminator, a low-budget surprise hit in 1984, launched director James Cameron’s career and cemented Schwarzenegger’s stardom, but it wasn’t until 1990 that the sequel was green-lit, mainly due to rights issues. At the Cannes Film Festival that year, Cameron handed executive producer Mario Kassar his script.
Today it’s easy to forget how risky it was to turn the Terminator, an iconic villain, an unstoppable, merciless death machine from an apocalyptic future, into a good guy who doesn’t kill anyone, stands on one leg when ordered, and looks like a horse when he attempts to smile. But Kassar didn’t balk, granting Cameron a budget ten times what he had had for the original, while stipulating that the film had to be in cinemas just 14 months later.
Even with some expensive sequences cut – including John Connor sending Kyle Reese back through time in the heart of Skynet HQ, a scene that would ultimately materialise in Terminator Genisys – the script was lengthy and extremely ambitious. Beginning on October 8th, 1990, the shooting schedule was front-loaded with effects shots to give the maximum time for CGI pioneers Industrial Light and Magic to realise the liquid metal T-1000 (Robert Patrick).
To further ease ILM’s burden, every trick in the book was employed to get T-1000 shots in camera wherever possible: quick shots of the villain’s fight with the T-800 (Schwarzenegger) in the steel mill finale were done with a stuntman in a foil suit; a chrome bust of Patrick was hand-raised into frame for a helicopter pilot’s reaction shot; the reforming of the shattered T-1000 was achieved by blowing mercury around with a hair dryer; bullet hits on the character’s torso were represented by spring-loaded silver “flowers” that burst out of a pre-scored shirt on cue.
Stan Winston Studio also constructed a number of cable-controlled puppets to show more extensive damage to the morphing menace. These included “Splash Head”, a bust of Patrick with the head split in two by a shotgun blast, and “Pretzel Man”, the nightmarish result of a grenade hit moments before the T-1000 falls to its doom in the molten steel.
Traditional models and rear projection are used throughout the film. A few instances are all too obvious to a modern audience, but most still look great and some are virtually undetectable. Did you know that the roll-over and crash of the cryo-tanker were shot with miniatures? Or that the T-800 plucking John off his bike in the drainage channel was filmed against a rear projection screen?
Plenty of the action was accomplished without such trickery. The production added a third storey to a disused office building near Silicon Valley, then blew it up with 100 gallons of petrol, to show the demise of Cyberdyne Systems. DP Adam Greenberg lit 5.5 miles of freeway for the car chase, and pilot Chuck Tamburro really did fly the T-1000’s police helicopter under a 20ft underpass.
Chaotic, confusing action scenes are the norm today, but it is notable that T2’s action is thrilling yet never unclear. The film sends somewhat mixed messages though, with its horrific images of nuclear annihilation and the T-800’s morality lessons from John juxtaposed with indulgent violence and a reverence for firearms. “I think of T2 as a violent movie about world peace,” Cameron paradoxically stated. “It’s an action movie about the value of human life.”
Meanwhile, 25 person-years of human life were being devoted by ILM to the T-1000’s metallic morphing abilities. Assistant VFX supervisor Mark Dippé noted: “We were pushing the limits of everything – the amount of disc space we had, the amount of memory we had in the computers, the amount of CPUs we had. Each shot, even though it only lasted about five seconds on the screen, typically would take about eight weeks to complete.”
The team began by painting a 2×2” grid on a near-naked Patrick and shooting reference footage of him walking, before laser-scanning his head at the appropriately-named Cyberware Laboratory. Four separate computer models of the T-1000 were built on Silicon Graphics Iris 4Ds, from an amorphous blob to a fully-detailed chrome replica of Patrick, each with corresponding points in 3D space so that the custom software Model Interp could morph between them.
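Model Interp itself was custom ILM software, but the core idea — blending between models whose vertices correspond one-to-one — can be sketched as simple linear interpolation. This is a deliberate simplification of what the real system did:

```python
def morph(points_a, points_b, t):
    """Linearly blend two models with corresponding 3D vertices.

    t = 0 returns the first model, t = 1 the second, and values
    in between produce the intermediate, 'morphing' shapes.
    """
    return [tuple(a + t * (b - a) for a, b in zip(pa, pb))
            for pa, pb in zip(points_a, points_b)]

# Halfway between a vertex at the origin and one at (2, 4, 6):
# morph([(0, 0, 0)], [(2, 4, 6)], 0.5) -> [(1.0, 2.0, 3.0)]
```

The reason the four T-1000 models needed corresponding points in 3D space is visible here: the blend only works if vertex `i` of one model means the same piece of anatomy as vertex `i` of the next.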
Other custom applications included Body Sock, a solution to gaps that initially appeared when the models flexed their joints, Polyalloy Shader, which gave the T-1000 its chrome appearance, and Make Sticky, with which images of Patrick were texture-mapped onto the distorting 3D model, as when he melts through a barred gate at the mental hospital.
The film’s legacy in visual effects – for which it won the 1992 Oscar – cannot be overstated. A straight line can be drawn from the water tendril in Cameron’s The Abyss, through T2 to Jurassic Park and all the way on to Avatar, with which Cameron again broke the record for the highest-grossing film of all time. The Avatar sequels will undoubtedly push the technology even further, but for many Cameron fans his greatest achievement will always be Terminator 2: Judgment Day, with its perfect blend of huge stunts, traditional effects and groundbreaking CGI.
Colour temperature starts with something mysterious called a “black body”, a theoretical object which absorbs all frequencies of electromagnetic radiation and re-emits it according to Planck’s Law. Put simply, Planck’s Law states that as the temperature of such a body increases, the light which it emits moves toward the blue end of the spectrum. (Remember from chemistry lessons how the tip of the blue flame was the hottest part of the Bunsen burner?)
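Wien’s displacement law, a consequence of Planck’s Law, makes the “hotter is bluer” point concrete: the wavelength at which a black body radiates most strongly is inversely proportional to its temperature. A quick sketch:

```python
WIEN_CONSTANT = 2.898e-3  # metre-kelvins

def peak_wavelength_nm(temp_k):
    """Wavelength of a black body's strongest emission, in nanometres."""
    return WIEN_CONSTANT / temp_k * 1e9

# A 3200 K tungsten filament peaks around 906 nm (in the infrared,
# which is why the visible portion of its output skews orange), while
# the sun's 5778 K surface peaks around 502 nm, towards the blue-green.
```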
Colour temperature is measured in kelvins, a scale of temperature that begins at absolute zero (-273°C), the coldest temperature physically possible in the universe. To convert centigrade to kelvin, simply add 273.
The surface of the sun has a temperature of 5,778K (5,505°C), so it emits a relatively blue light. The filament of a tungsten studio lamp reaches roughly 3,200K (2,927°C), providing more of an orange light. Connect that fixture to a dimmer and bring it down to 50% intensity and you might get a colour temperature of 2,950K, even more orange.
Incandescent lamps and the sun’s surface follow Planck’s Law fairly closely, but not all light sources rely on thermal radiation, and so their colour output is not dependent on temperature alone. This leads us to the concept of “correlated colour temperature”.
The correlated colour temperature of a source is the temperature at which a black body would have to be in order to emit the same colour of light as that source. For example, the earth’s atmosphere isn’t at a temperature of 7,100K, but the light from a clear sky is as blue as that of a Planckian body glowing at that temperature. Therefore a clear blue sky has a correlated colour temperature (CCT) of 7,100K.
LED and fluorescent lights can have their colour cast at least partly defined by CCT, though since CCT is one-dimensional, measuring only the amount of blue versus red, it may give us an incomplete picture. The amounts of green and magenta which LEDs and fluorescents emit varies too, and some parts of the spectrum might be missing altogether, but that’s a whole other can of worms.
The human eye-brain system ignores most differences of colour temperature in daily life, accepting all but the most extreme examples as white light. In professional cinematography, we choose a white balance either to render colours as our eyes perceive them or for creative effect.
Most cameras today have a number of white balance presets, such as tungsten, sunny day and cloudy day, and the options to dial in a numerical colour temperature directly or to tell the camera that what it’s currently looking at (typically a white sheet of paper) is indeed white. These work by applying or reducing gain to the red or blue channels of the electronic image.
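The “point the camera at a white sheet of paper” method can be sketched as follows: derive red and blue gains that map the captured patch to neutral, holding green at unity. This is a simplification of what real cameras do, and the sample values are invented for illustration:

```python
def gains_from_white_patch(r, g, b):
    """Red and blue gains that neutralise a patch the user declares white.

    Green is held at unity and the other channels are scaled to match it.
    """
    return g / r, g / b

def apply_white_balance(pixel, r_gain, b_gain):
    r, g, b = pixel
    return (round(r * r_gain), g, round(b * b_gain))

# A white sheet under tungsten light might read (230, 180, 120): warm,
# because red is high and blue is low. Balancing it:
# r_gain, b_gain = gains_from_white_patch(230, 180, 120)
# apply_white_balance((230, 180, 120), r_gain, b_gain) -> (180, 180, 180)
```

Note that the tungsten case needs a blue gain of 1.5 but a red gain below 1.0, which is exactly why blue gain, and therefore blue-channel noise, is the concern mentioned below.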
Interestingly, this means that all cameras have a “native” white balance, a white balance setting at which the least total gain is applied to the colour channels. Arri quotes 5,600K for the Alexa, and indeed the silicon in all digital sensors is inherently less sensitive to blue light than red, making large amounts of blue gain necessary under tungsten lighting. In an extreme scenario – shooting dark, saturated blues in tungsten mode, for example – this might result in objectionable picture noise, but the vast majority of the time it isn’t an issue.
The difficulty with white balance is mixed lighting. A typical example is a person standing in a room with a window on one side of them and a tungsten lamp on the other. Set your camera’s white balance to daylight (perhaps 5,600K) and the window side of their face looks correct, but the other side looks orange. Change the white balance to tungsten (3,200K) and you will correct that side of the subject’s face, but the daylight side will now look blue.
Throughout much of the history of colour cinematography, this sort of thing was considered to be an error. To correct it, you would add CTB (colour temperature blue) gel to the tungsten lamp or perhaps even place CTO (colour temperature orange) gel over the window. Nowadays, of course, we have bi-colour and RGB LED fixtures whose colour temperature can be instantly changed, but more importantly there has been a shift in taste. We’re no longer tied to making all light look white.
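Correction gels like CTB and CTO are specified in “mired” shifts — a mired being one million divided by the kelvin value — because equal mired shifts produce perceptually similar changes anywhere on the scale. Using the manufacturers’ approximate values for the full-strength gels:

```python
def mired(kelvin):
    """Micro reciprocal degrees: one million divided by the kelvin value."""
    return 1_000_000 / kelvin

def apply_mired_shift(kelvin, shift):
    """New colour temperature after gelling with a given mired shift."""
    return 1_000_000 / (mired(kelvin) + shift)

# Full CTB is roughly -131 mired: it lifts a 3200 K tungsten lamp to
# about 5510 K. Full CTO is roughly +159 mired: it pulls 5600 K
# daylight down to about 2960 K.
```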
To give just one example, Suzie Lavelle, award-winning DP of Normal People, almost always shoots at 4,300K, halfway between typical tungsten and daylight temperatures. She allows her practical lamps to look warm and cosy, while daylight sources come out as a contrasting blue.
It is important to understand colour temperature as a DP, so that you can plan your lighting set-ups and know what colours will be obtained from different sources. However, the choice of white balance is ultimately a creative one, perhaps made at the monitor, dialling through the kelvins to see what you like, or even changed completely in post-production.
Back in February 2019 I spent a long day in Black Park, a forest behind Pinewood Studios, shooting a short film called Alder for director Vanda Ladeira. A little late perhaps, but here are my reflections on the cinematography and general experience of making this experimental fairytale.
The film is about a forager (Odne Stenseth) who does not realise he is being watched by the very spirit of the forest, the titular Alder (Libby Welsh). As he cuts a sprig of holly, or steps on a mushroom, he is unknowingly causing her pain. Meanwhile a group of ghosts – Alder’s former victims? – cavort in the woodland, and strips of film made with ground-up human bone reach out from the trees to ensnare the forager.
Vanda contacted me after seeing my work on Ren: The Girl with the Mark. She was keen for Alder’s lair to have the same feel as Karn’s house in that series. We had a number of meetings to discuss the tone, visuals and the logistics of the shoot, which initially was going to take place over two days but was eventually compressed to one.
In October 2018 we conducted a recce in a forest that we ultimately weren’t able to use. I remember at the time that I was considering shooting the project on celluloid, tying in with the plot point about Alder making film from her victims’ bones. I dropped the idea after taking light readings on that recce – when it was very overcast – and realising just how dark it could be under the tree canopy.
We ultimately shot on a Blackmagic Ursa Mini and Xeen primes, provided along with the lighting kit by gaffer Jeremy Dawson. The Blackmagic sensors seem to do very well with earthy tones, as I noticed on the village set of Ren, and the Ursa rendered the browns of the bracken, the soil and the forager’s costume nicely. Jeremy also provided us with a jib which enabled us to underscore the forager’s action with some definite moves: an introductory crane down; a dramatic pull up as he drives his knife into a tree; and a frantic boom down with him as he searches for his lost compass. In Alder’s lair we kept the camera drifting from side to side or up and down to bring energy to her more static scenes.
Lighting for the forager’s scenes was all natural, with just a little bounce or negative fill from time to time to keep some shape to the image. An Artem smoke gun, operated by Claire Finn, was used on almost every shot to give the forest some life and mystery, and also to keep the backgrounds from getting too busy; the grey wall of smoke serves to fade the background slightly, keeping the eye focused on the foreground action.
As there was no dialogue, I was free to change the frame rate expressively. Examples include: over-cranking close-ups of the forager’s feet and hands in contact with nature, emphasising the sensuality of his unwitting connection to Alder; over-cranking the dance of the ghosts to make their movements even more beautiful and supernatural; and under-cranking the forager slightly to enhance his panic when he finds himself lost and surrounded.
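The arithmetic behind over- and under-cranking is simple: the apparent speed change is just the capture frame rate divided by the playback frame rate. A minimal sketch, assuming a standard 24fps project:

```python
def slow_factor(capture_fps, project_fps=24):
    """How many times slower the action appears when played at project speed.

    Values above 1.0 are over-cranked (slow motion); values below 1.0
    are under-cranked (sped-up action).
    """
    return capture_fps / project_fps

# Over-cranking at 48 fps for a 24 fps project plays back at half speed
# (factor 2.0); under-cranking at 12 fps plays back twice as fast (0.5).
```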
Alder’s lair was a set built by Denisa Dumitrescu in the forest. I took broadly the same approach to lighting it as I had for the reference scene from Ren, making some holes in the branch-covered roof and shining a blinder (a bank of four LED spotlights) through it to produce dappled shafts of sunlight. On the floor around Alder were a number of candles; we beefed up the light from these by skipping an 800W tungsten lamp off a bounce board on the floor.
The biggest challenge was the meeting between the two main characters, a scene scripted for daylight which we were forced to shoot after dark due to running behind schedule. It was the longest and most important scene in the film and suddenly the cinematography had to be completely improvised. We did not have anywhere near the lighting package that a woodland night exterior normally calls for – just 800W tungsten lamps, a few LED fixtures, and a generator only powerful enough to run one of each.
What I ended up doing was putting an 800 in the background, ostensibly as a setting sun, and bouncing a blinder off poly-board as fill. We shot the whole scene through in a single handheld shot, once with smoke and once without, then picked up a few close-ups. I tried to hide the lack of light in the background by allowing the 800 to flare the lens and render the smoke almost impenetrable at times. Vanda and her editor, Tom Chandler, leant into the strange, stylised look and bravely intercut the smoky and smokeless takes. The result is much more magical and expressive than what we would have shot if we had still had daylight.
You can watch the finished film here. It won me Best Cinematographer at the New York Cinematography Awards (August 2019) and Film Craft Award: Cinematography at Play Short International Film Awards (2019).
Today filming begins on the Shakespearian feature I have been prepping since early February. All of last week was again spent in rehearsals, this time focusing on the second half of the script.
By the end of the week I had storyboarded almost the entire film, using Artemis Pro. The production designer was able to print these out and go through them looking for any backgrounds that he might not yet have dressed, or any obtrusive existing objects that should be removed. The 1st AD was also using them to help him plan, as he had not been present at rehearsals. This led to a minor panic when I erroneously included some characters in the background of a shot that those actors were not scheduled for!
Aside from producing these storyboards and getting a fantastic understanding of how all the scenes are going to be played and blocked, a big benefit of the rehearsal weeks was the opportunity to get to know the cast. Normally I have to wave a big camera in an actor’s face the first time I meet them. It’s much better to ease them and me into the process the way we’ve done on this production. A particular highlight was when the well-known lead actor performed some of the famous soliloquies – in the absence of a camera – right into my eyes.
It was a very busy week for all concerned. When the cast weren’t in rehearsals they were in costume fittings or make-up tests, or training for the sword-fight, or doing press interviews.
The gaffer started work on Wednesday, and was joined by the best boy and spark on Thursday. After loading in the equipment, their first task was to re-globe all the sconces and ceiling lights in the auditorium. Later they gelled all the emergency lights to make them dimmer and warmer in colour, ran distro to various convenient points, and cut poly-boards to size.
The camera kit also turned up on Thursday, a slightly surreal event for me after so long working in the building with just my laptop and iPhone. For a few scenes Sean wants to create a kaleidoscopic effect, so I had purchased some cheap kaleidoscope party glasses, a 6” teaching prism, and a set of crystals which can be hung off the matte box. Ironically the cheap glasses give the best effect! These will be hand-bashed in front of the lens, whereas the prism can be clamped to a noga arm for a more controlled effect.
I gave the focus puller a tour of the building so that he could start to think about monitor positions. That will definitely be a tricky aspect of the production with all the cramped backstage spaces.
I feel better-prepared now than I have ever felt going into a feature. It is such a contrast to, say, Heretiks, where I had just one week to get up to speed, and the gaffer had no prep time whatsoever. Nonetheless, there are some things you just can’t work out until the day, and that’s where the stress and excitement come from!
I’ll continue to write a blog during production, but I won’t be publishing it until the film is released. So there will be no new posts for the next few weeks, but normal service will resume in May! See you on the other side.
By the time you read this I will have entered the Covid bubble for the as-yet-unannounced Shakespearian film, the beginning of two weeks of full-time prep before cameras finally roll.
The week just gone has been something of a calm before the storm. It started with two important Zoom meetings: one about practicals, the other about the schedule.
The first meeting involved going through all the locations with the production designer, who explained what practical lamps he planned to put in each, while I sometimes asked for additional ones. Practicals are going to be a big part of our lighting, and this sort of collaboration with the art department can make the difference between a smoothly-running shoot and a world of pain where you’re forever trying to hide film lights because you don’t have enough practical sources.
The second meeting, coming shortly after I saw the shooting schedule for the first time, was an in-depth discussion of it with the director, producer, line producer and 1st AD. Most of my concerns – other than some days which felt uncomfortably heavy, and even one or two that seemed wastefully light – were around times of day and equipment. For example, one daylight interior scene was scheduled for the end of day, when we might be losing the light. (The next day I went through it all again by myself and made sure that any night scenes scheduled for daytime could be reasonably done with blacked-out windows.)
We also talked a lot about how things could be rejigged to get as much value as possible out of the two days that we have the crane. It’s expensive, and no-one wants it sitting around while we shoot little dialogue scenes in tiny rooms. Nor do I want one or two scenes in the film to have lots of crane shots and the rest to have none; a sprinkling of them throughout the film would be preferable, though it would mean lots of costume and make-up changes.
Another draft of the script was issued, with pretty minor changes, though one extra room has been introduced, so that will need a proper recce next time I’m there. Reading through a new draft and updating my notes takes the best part of a day, and though it can sometimes feel like a chore, every reading helps me understand the story and characters better.
I did a little more shot-listing later in the week, but it will be much better and easier to do this at the rehearsals over the next fortnight, when I can see how the actors are approaching their characters and how they’re going to use the spaces. I can even take Artemis photos if it doesn’t interrupt their process too much. Roll on rehearsals!
The main event of last week’s prep was a test at Panavision of the Arri Alexa XT, Red Gemini and Sony F55, along with Cooke Panchro, Cooke Varotal, Zeiss Superspeed and Angenieux glass. More on that below, along with footage.
The week started with Zoom meetings with the costume designer, the make-up artist, potential fight choreographers and a theatrical lighting designer. The latter is handling a number of scenes which take place on a stage, which is a new and exciting collaboration for me. I met with her at the location the next day, along with the gaffer and best boy. After discussing the stage scenes and what extra sources we might need – even as some of them were starting to be rigged – I left the lighting designer to it. The rest of us then toured the various rooms of the location, with the best boy making notes and lighting plans on his tablet as the gaffer and I discussed them. They also took measurements and worked out what distro they would need, delivering a lighting kit list to production the next day.
Meanwhile, at the request of the producer, I began a shot list, beginning with two logistically complex scenes. Despite all the recces so far, I’ve not thought about shots as much as you might think, except where they are specified in the script or where they jumped out at me when viewing the location. I expect that much of the shot planning will be done during the rehearsals, using Artemis Pro. That’s much better and easier than sitting at home trying to imagine things, but it’s useful for other departments to be able to see a shot list as early as possible.
So, the camera tests. I knew all along that I wanted to test multiple cameras and lenses to find the right ones for this project, a practice that is common on features but which, for one reason or another, I’ve never had a proper chance to do before. So I was very excited to spend Wednesday at Panavision, not far from my old stomping ground in Perivale, playing around with expensive equipment.
Specifically we had: an Arri Alexa – a camera I’m very familiar with, and my gut instinct for shooting this project on; a Sony F55 – which I was curious to test because it was used to shoot the beautiful Outlander series; and a Red Gemini – because I haven’t used a Red in years and I wanted to check I wasn’t missing out on something awesome.
For lenses we had: a set of Cooke Panchros – again a gut instinct (I’ve never used them, but from what I’ve read they seemed to fit); a set of Zeiss Superspeeds – selected after reviewing my 2017 test footage from Arri Rental; a couple of Cooke Varotal zooms, and the equivalents by the ever-reliable Angenieux. Other than the Angenieux we used on the B-camera for The Little Mermaid (which I don’t think we ever zoomed during a take), I’ve not used cinema zooms before, but I want the old-fashioned look for this project.
Here are the edited highlights from the tests…
You’ll notice that the Sony F55 disappears from the video quite early on. This is because, although I quite liked the camera on the day, as soon as I looked at the images side by side I could see that the Sony was significantly softer than the other two.
So it was down to the Alexa vs. the Gemini, and the Cookes vs. the Superspeeds. I spent most of Thursday and all of Friday morning playing with the footage in DaVinci Resolve, trying to decide between these two pairs of very close contenders. I tried various LUTs, did some rough grading (very badly, because I’m not a colourist), tested how far I could brighten the footage before it broke down, and examined flares and bokeh obsessively.
Ultimately I chose the Cooke Panchros because (a) they have a beautiful and very natural-looking flare pattern, (b) the bokeh has a slight glow to it which I like, (c) the bokeh remains a nice shape when stopped down, unlike the Superspeeds’, which goes a bit geometric, (d) they seem sharper than the Superspeeds at the edges of frame when wide open, and (e) more lengths are available.
As for the zoom lenses (not included in the video), the Cooke and the Angenieux were very similar indeed. I chose the former because it focuses a little closer and the bokeh again has that nice glow.
I came very close to picking the Gemini as my camera. I think you’d have to say, objectively, it produces a better image than the Alexa, heretical as that may sound. The colours seem more realistic (although we didn’t shoot a colour chart, which was a major oversight) and it grades extremely well. But…
I’m not making a documentary. I want a cinematic look, and while the Gemini is by no means un-cinematic, the Alexa was clearly engineered by people who loved the look of film and strove to recreate it. When comparing the footage with the Godfather and Fanny and Alexander screen-grabs that are the touchstone of the look I want to create, the Alexa was just a little bit closer. My familiarity and comfort level with the Alexa was a factor too, and the ACs felt the same way.
I’m very glad to have tested the Gemini though, and next time I’m called upon to shoot something great and deliver in 4K (not a requirement on this project) I will know exactly where to turn. A couple of interesting things I learnt about it are: (1) whichever resolution (and concomitant crop factor) you select, you can record a down-scaled 2K ProRes file, and this goes for the Helium too; (2) 4K gives the Super-35 field of view, whereas 5K shows more, resulting in some lenses vignetting at this resolution.
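The relationship between recording resolution and field of view on a windowed sensor is simple geometry: a lower resolution uses a smaller active area of the sensor, so the same lens sees a narrower angle. A rough sketch of the arithmetic, using approximate Gemini sensor figures that are my own assumptions rather than anything measured at the test:

```python
import math

# Assumed (approximate) Gemini geometry: ~30.72 mm of sensor width
# across 5120 photosites at 5K. These numbers are illustrative only.
FULL_WIDTH_MM = 30.72
FULL_WIDTH_PX = 5120

def sensor_width_mm(res_px):
    """Active sensor width (mm) when recording at a given horizontal resolution."""
    return FULL_WIDTH_MM * res_px / FULL_WIDTH_PX

def horizontal_aov_deg(focal_mm, res_px):
    """Horizontal angle of view (degrees) for a focal length at a recording width."""
    width = sensor_width_mm(res_px)
    return math.degrees(2 * math.atan(width / (2 * focal_mm)))

# 4K windows down to roughly Super-35 width; 5K uses the full sensor and sees more.
for res in (4096, 5120):
    print(res, round(sensor_width_mm(res), 2), "mm,",
          round(horizontal_aov_deg(32, res), 1), "deg at 32mm")
```

This is also why some lenses vignette at 5K but not at 4K: the wider active area demands a larger image circle from the glass.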
Prep for the yet-to-be-announced Shakespearian feature continued last week. Tuesday and Wednesday saw me on Zoom calls with the producers – discussing camera kit quotes – and the costume designer. “Will we see enough of his face through this headgear?” was a question for the latter. She in turn asked how white a white coat should be, and how dark surrounding characters should be to make one person in black stand out. Difficult things to quantify, but important.
The week’s main event was another two-day recce with the director and production designer. The designer had produced beautiful and detailed mood-boards for every room, and had even started to bring in the right furniture and test paint colours. The main aim of the recce was to discuss and sign off on his decisions so that decoration and dressing could step up to full steam.
As we moved from room to room, trying to keep in story order whenever possible, the director revealed lots of his thoughts about the tone and key beats of each scene. I was pleased to find that these were largely in a similar vein to notes I had amassed on my own spreadsheet. And when they weren’t in sync, that was very useful to know at this stage! For most scenes I showed him a reference image or two, again from my spreadsheet, to double-check that we were on the same page.
We were visited during the recce by a grip who had come to see whether a crane would fit into our main location, and if so what kind of crane and whether it could achieve the shots we wanted. I had envisaged using a Giraffe like the one we had on The Little Mermaid, but the grip suggested we would be much better off with a 23ft Technocrane and a basic remote head, as this can telescope and retract rather than only sweeping around in an arc. We measured the distances to see where the camera could end up, and then I used Artemis Pro – a director’s viewfinder app – to see what framing that would translate to with various lenses. One of our most important shots should just be possible at the full extent of the arm, combined with the full range of a 25-250mm zoom.
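Checking whether a framing is achievable at the arm's full extent is the same sum Artemis does under the hood: with a simple pinhole model, the focal length needed to fit a given subject width at a given distance is sensor width × distance ÷ subject width. A quick sketch, with made-up distances rather than the actual recce measurements:

```python
# Pinhole-model framing check. The 18 m reach and 2 m framing width below
# are illustrative numbers, not measurements from the recce.
SENSOR_WIDTH_MM = 24.89  # approximate Super-35 horizontal width

def focal_for_framing(distance_m, subject_width_m):
    """Focal length (mm) that just fits subject_width horizontally at distance."""
    return SENSOR_WIDTH_MM * distance_m / subject_width_m

needed = focal_for_framing(18.0, 2.0)
print(round(needed, 1), "mm needed")
print("within 25-250mm zoom:", 25 <= needed <= 250)
```

If the required focal length falls inside the zoom's range, the shot is possible from that camera position; if it lands near the long end, as here, there's little margin for error in the crane placement.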
Whether the budget can afford the crane, however, is yet to be confirmed. This week I am due to conduct camera and lens tests, and once I’ve made a decision on those then we will know what is left for fancy grip equipment!
The only other thing to happen last week was the hiring of a data wrangler. Since I lined up the 1st and 2nd ACs quite soon after my own hiring, the camera department is now complete.
I continue to saturate myself in the script for the yet-to-be-announced Shakespearian film. Some other little projects I had going on have now wrapped up, leaving me free to concentrate purely on this production, which is due to start shooting a month from now.
I spent the best part of last Monday reading a new draft of the screenplay and updating my spreadsheet of notes to reflect the changes. Going back over this spreadsheet and the script and re-evaluating them from different angles formed a significant part of the rest of the week. On Thursday, for example, I focused on the swordfight (narrows it down, Shakespeare fans!), scouring YouTube for reference videos and noting which camera angles seem most dangerous and engaging. In fact, watching references was another big part of the week. I worked my way through the whole Godfather trilogy (above), some more episodes of Servant, bits of several action movies that have a specific type of night exterior, and a couple of the lead actor’s recent films, to see how other DPs have lit and lensed him.
At the end of the week I went back over the spreadsheet and filled in at least one idea for every scene that did not yet have an entry in its “camera” or “lighting” column. Sometimes this would be an idea for a specific shot – e.g. “angle from outside the window looking in”; sometimes it would be a general vibe for the camerawork – e.g. “close, handheld, intimate”; sometimes a specific source – e.g. “soft top-light rigged to ladder”; sometimes a more general lighting note – e.g. “group in a patch of light, surroundings dark”.
Production sent over the quotes they have received for my camera list. At least one of them was within the budget, so that’s good! This week I’ll discuss that with the producers and hopefully decide which rental house we’re going with.
Speaking of equipment, a cheap novelty optical item arrived from eBay. I used this and my iPad to shoot a very rough demonstration of how we might achieve a special effect in camera, sending the video to the director for his feedback. He liked it, and wants to add in a few more instances of it throughout the film.
Another idea I proposed was a lighting effect, for which I sent the director this video I’d found online (below). I don’t intend to do something exactly like this in the film, but I saw a way it could be modified to our story. I ended up shooting my own rough test that is closer to how I see it working in our film.
Less exciting than any of the above, but very important, was taking an online Screenskills course in Covid awareness. I’d done the Basic Awareness course already, which takes about 30 minutes including a brief quiz, but Screenskills were offering free places for HoDs on a more in-depth course, so I signed up. This consisted of a three-hour presentation about the virus, how it can spread on set and what can be done to mitigate it in various departments, followed by another quiz. I learnt a few new things and my awareness was indeed raised.