“Ren: The Girl with the Mark” – Season Two

The reason it’s been so quiet on the blog here is that I’ve insanely taken on producing a no-budget fantasy-adventure web series, Ren: The Girl with the Mark. Readers with long memories may recall I was the DP on the first season way back in 2014, and got involved with post throughout 2015 and into 2016 when it was released. Well, now I’m the showrunner!

I’ve launched a Patreon page to fund the series as an ongoing concern, and you’ll need to subscribe to read it regularly, but here are the first two entries to whet your appetite. Please consider joining our Patreon community to get exclusive behind-the-scenes access, fiction from the world of Ren and much more.

 

The Story So Far

Let me start by bringing you up to date with where we are now.

Season One of Ren: The Girl with the Mark was released in March 2016, created and written by Kate Madison and Christopher Dane, and directed by Kate. (I joined as the director of photography and ended up as part of the core team who shepherded the show through post-production.) The series went on to win 14 international awards from over 40 nominations, and today has about 14 million aggregate episode views on YouTube – an amazing response!

For one reason or another it wasn’t until 2019 that we started gearing up for Season Two. Kate and I wrote the scripts with Ash Finn and Ashram Maharaj, and in early 2020 we ran a Kickstarter to finance new episodes on a bigger scale than the first season. Sadly that Kickstarter campaign was unsuccessful, and just a few weeks later the Covid-19 pandemic reached the UK, which seemed to draw a permanent line under the project.

Cut to: six months later. It’s the second lockdown and, like a lot of people, I’m super bored. To kill some time I thought it would be fun to write a new draft of Ren Season Two. My goal was to address some problems that had been flagged up with the 2019 draft while keeping as much of the good material as possible. Pretty soon I realised that I needed to know what would happen in Season Three in order to give Season Two the right ending, so I wrote that too.

“Well, that was fun,” I thought when I had finished, and forced myself to put it away and focus on other things.

Almost two years passed. The pandemic receded. And I had an itch. A voice in the back of my head saying, “What if…?”

Finally, around September 2022, I asked Kate and Chris if they would consider letting me take the show on. I had given it some serious thought. After the 2020 Kickstarter didn’t succeed I knew that the new season would have to be made on the same small scale as the first one, with an entirely unpaid cast and crew. I also knew that no big streamer or Hollywood studio was going to come along and wave a magic wand to transform it into a big-budget production, because if that was going to happen it would have happened back in 2016. But Kate and Chris had achieved amazing things on their tiny Season One budget, thanks in no small part to a dedicated army of volunteers, and I believed I could do the same.

Kate and Chris read my version of the script, they felt it was in keeping with the world they had created, and they trusted me to produce something that would be faithful to the legacy of Season One. Even better, they agreed to each direct an episode!

 

Kicking Off 2023

Thanks to everyone who’s joined this community so far! We haven’t even launched it on social media yet – that’s coming later this month – so it’s great to have so many of you eager to be involved.

Things have really started to kick off on Ren Season Two in the last few weeks.

Some of you will remember Born of Hope, Kate Madison’s phenomenally successful Lord of the Rings fan film from 2009. For that film a wooden hand-cart was constructed by Mike Rudin. It then appeared a couple of times in Season One of Ren, and has been living in her front garden ever since. Over Christmas Mike picked it up and took it to his garage workshop where he’ll be refurbishing it and turning it into a Kah’Nath prison cart that features in 202 (Season Two, Episode Two) and 203 (Season Two, Episode Three)… and again in Season Three… but let’s not get ahead of ourselves!

Meanwhile Hans Goosen, who helped make the leather for Season One as well as various other props, and appeared as both a villager and a Kah’Nath soldier, is making some of the new coins in the Alathian currency. I say “new” – they were all designed for Season One by James Ewing and Christopher Dane but only the boars and kings were actually made. Hans is now completing the set with horses, stags, eagles and wolves. First though he had to work out what each one is worth to create a realistic currency system – more on that in a future lore post!

Ronin Traynor, who returns as stunt co-ordinator for Season Two, has already planned and videoed the choreography for part of the knife fight in 204.

Locations have been the biggest area of our focus, however. Whereas Season One was mostly set in Lyngarth, Ren’s village, Season Two is all about Ren and Hunter’s journey to find the Archivist. Just yesterday Ash Finn went up to the Peak District to look at a potential location for Tarik’s Mill, a place mentioned in Season One but not yet seen. We are also considering locations in South Wales and near Portsmouth as well as in Cambridgeshire, so we’re going to be racking up the miles!

We’re also looking for a studio space to base ourselves in. If anyone knows of a barn or warehouse type of building in Cambridgeshire that might be available at an affordable rate, please let me know!


“Harvey Greenfield is Running Late”: October 2022 Pick-ups

Day 25

Fourteen months ago, production began on the comedy feature Harvey Greenfield is Running Late. Most of the editing is done, and yesterday a reduced crew assembled to shoot one final scene and a few odd shots to plug holes.

The crew may have been reduced, but the cast was bigger than it’s ever been. Jonnie and the team managed to pack out Sessions House, a historic courthouse in Ely, with about 60 extras to watch Harvey (Paul Richards) present a case against Choice. Also not reduced was the shot list, an ambitious 21 set-ups to be accomplished in just a few hours. I’m not sure how many we got in the end, but we covered everything so we must have got close.

Since the budget was a dim and distant memory, I shot on Jonnie’s own Canon C200 and lenses. An important part of Harvey’s visual grammar is the use of wide lenses for stressy scenes, with a 14mm having been the apotheosis throughout production. For this reason, but also for speed, we shot almost everything in the courthouse on Jonnie’s Samyang 14mm, swinging to an L-series 24-70mm zoom right at the end. We couldn’t get hold of a Soft/FX filter to perfectly match principal photography, but we were able to borrow a 1/8th Black Pro Mist to provide a little diffusion at least.

Photo: Cambridge News

For lighting, Jeremy set up his Aputure 300D and 600D in an upper gallery at the side of the courtroom, firing into the wall to provide a soft side-light throughout the room. We’d hoped not to have to tweak it much from shot to shot, but it did prove necessary, not least because we needed to look up to that gallery in a couple of set-ups. I wanted to use a lot of negative fill to bring down the ambient bounce off the walls, which had evidently been repainted at some point in the recent past by someone with an Ideal Home subscription. But the 14mm doesn’t leave much room to hide things, so there was a limit to the contrast we could introduce. Adjusting the blinds over the main windows – whenever they were out of frame – became one of our major methods of controlling the light.

Once Harvey had rested his case we moved out into the carpark to get Bryan’s “manic wides”. These grotesque caricatures of the supporting characters, imagined by Harvey at the climax of the film, required each actor, in this case Alan, to deliver key lines from their earlier scenes while I shoved the 14mm lens in their face and dutch-tilted like crazy. We recreated the day-for-night shot grabbed with the limo back on Day 13, covering the car in black drapes and firing the 300D with Urban Sodium gel through a side window – orange being another symbol of stress in the movie.

The few of us that were left then regrouped at Jonnie’s house for some ADR and a handful of inserts. The probe lens got another airing to capture a macro shot of a tape recorder, and I got to double as Harvey’s hands flicking through a book. In Paul’s very last shot he was out of focus, due to a lack of continuity-matching make-up, with the book sharp in the foreground.

The final shot of all was Cat, the editor, dropping some Post-its into frame and Jonnie, clad in Harvey’s jacket, picking them up. Not a grand shot to go out on, but one that nicely sums up the collaborative, all-hands-on-deck nature of no-budget filmmaking. It’s been a fun ride.


5 Things a DP Can Do to Help the VFX Department

Almost every film today has visual effects of some kind or another, be it compositing a phone screen for a couple of shots or adding a fleet of attacking spaceships and their laser blasts destroying distant CG buildings. Many smaller productions cannot afford to have a VFX supervisor on set, however, so a conscientious DP should be looking out for ways they can ensure the footage they capture is not going to cause complications or rack up extra costs down the line.

 

1. Interactive Light

VFX will often look a lot more convincing if they affect the lighting on the actors or set. This could be as simple as flashing a lamp for a gunshot that’s going to be added in post, or it could involve programming a dynamic lighting effect into a row of Astera tubes. Remember that it could be negative lighting; I once had to shoot day exterior scenes next to an alien spaceship that wasn’t really there, so I had the gaffer rig a wall of floppy flags to create its shadow.

Beware though: inaccurate interactive lighting – be it mistimed, the wrong colour or casting unrealistic shadows – is worse than none at all. I would always advise shooting a take without the interactive lighting, because even if you do it perfectly there is always the chance that the effect will be changed in post-production from what was agreed.

An unused take from “Ren: The Girl with the Mark” in which I used green interactive light to match the concept art of the VFX. The VFX colour was changed to gold in post and we were very glad we’d done a safety take without the light!

 

2. Tracking

If you are doing a moving shot to which something will be added in post, consider adding some tracking crosses into the scene. Tracking software is really good now, but it doesn’t hurt to help it along, especially if you’re dealing with a fairly featureless surface like a blank TV screen, and definitely with green screens. A simple X made of white camera tape will do the job. Be careful not to cover up any detail that will make the X hard to paint out.

 

3. Recording Mode

If you are not generally shooting at the highest quality your camera permits, consider switching up to it for VFX shots at least. This might mean going from, say, ProRes to RAW, increasing the bit depth or reducing the compression ratio. The cleaner the image, the easier you make life for the VFX team, particularly when it comes to pulling keys and motion tracking.

If you’re able to increase the resolution so that there is extra image outside the frame, that will help VFX with any stabilisation, artificial image shake or adjustments to the camera move they need to make once the CG elements are in.
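To put rough numbers on why recording mode matters, here is a back-of-envelope sketch of storage data rates. The sensor size, bit depth and compression ratio are illustrative figures of my own, not the spec of any particular camera:

```python
# Back-of-envelope data rates for different recording modes.
# All figures are illustrative, not from any camera's spec sheet.

def raw_data_rate_mbs(width, height, bit_depth, fps):
    """Uncompressed Bayer RAW: one sample per photosite per frame."""
    bits_per_second = width * height * bit_depth * fps
    return bits_per_second / 8 / 1_000_000  # megabytes per second

# A hypothetical 4K sensor at 24 fps:
uncompressed = raw_data_rate_mbs(4096, 2160, 12, 24)  # ~318 MB/s
compressed_3_to_1 = uncompressed / 3                  # lightly compressed RAW

print(f"Uncompressed 12-bit RAW: {uncompressed:.0f} MB/s")
print(f"3:1 compressed RAW:      {compressed_3_to_1:.0f} MB/s")
```

Even a light compression ratio makes a big difference to card and drive capacity, which is why saving the heaviest settings for the shots that really need them – the VFX shots – is often a sensible compromise.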

 

4. Camera Log

This camera log from “Rory’s Way” includes extra details because a baby had to be composited into some of the shots.

Accurate information about the lens and camera is important for the VFX department. Normally your 2nd AC will be recording focal length, T-stop, white balance, ISO, shutter angle and filtration, but for VFX shots a few extra things will be useful: lens height from the ground, tilt angle (use an inclinometer app) and at least a rough focal distance.
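As a sketch of what such a log might look like in data form, here is a minimal structure covering the standard fields plus the VFX extras. The field names and sample values are my own invention, not any industry-standard template:

```python
from dataclasses import dataclass, asdict
import csv
import io

@dataclass
class VfxShotLog:
    # Standard fields a 2nd AC would normally record
    slate: str
    focal_length_mm: float
    t_stop: float
    white_balance_k: int
    iso: int
    shutter_angle: float
    filtration: str
    # Extra fields worth recording for VFX shots
    lens_height_m: float     # lens height from the ground
    tilt_deg: float          # from an inclinometer app (negative = tilted down)
    focus_distance_m: float  # even a rough figure helps

# A hypothetical entry:
entry = VfxShotLog("42A-3", 35.0, 2.8, 5600, 800, 180.0, "1/8 BPM",
                   1.5, -4.0, 3.2)

# Dump to CSV so the log can travel with the footage
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=asdict(entry).keys())
writer.writeheader()
writer.writerow(asdict(entry))
print(buf.getvalue())
```

The exact format matters far less than consistency – whatever the VFX team receives, they should be able to match any frame of footage to a complete set of lens and camera data.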

 

5. Green Screens

There are a whole host of things to look out for when you’re shooting on blue or green screens, but the main one is lighting. You should light the screen as evenly as possible, and to the same level as your key light. Once the camera position is set, a good tip is to bring in a couple of flags just out of the sides of frame to cut as much green spill as possible off the talent, so that the VFX team can pull a clean key.

Note the tracking crosses on the green screen in this log frame from “The Little Mermaid”.

Defying Gravity on Film

Filmmakers have used all kinds of tricks over the years to show low or zero gravity on screen, from wire work to underwater shooting, and more recently even blasting off to capture the real thing.

Many early sci-fi films simply ignored the realities of being in space. The 1964 adaptation of H. G. Wells’ The First Men in the Moon, for example, shows its Victorian astronauts walking around the “lunar” surface without any attempt to disguise the earthly gravity.

But as the space race heated up, and audiences were treated to real footage of astronauts in Earth orbit, greater realism was required from filmmakers. None met this challenge more determinedly than Stanley Kubrick, who built a huge rotating set for 2001: A Space Odyssey. The set was based on a real concept of artificial gravity: spinning the spacecraft to create centrifugal force that pushes astronauts out to the circular wall, which effectively becomes the floor. Kubrick’s giant hamster wheel allowed him to film Dr Dave Bowman (Keir Dullea) running around this circular wall.
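The physics behind that set can be put in a couple of lines: spinning at angular velocity ω produces a centripetal acceleration of ω²r at radius r, so for any given radius you can work out the spin rate needed to fake 1 g. A quick sketch, treating the centrifuge set’s reported 38 ft diameter as an illustrative radius of about 5.8 m:

```python
import math

G = 9.81  # m/s^2, Earth surface gravity

def rpm_for_gravity(radius_m, g_target=G):
    """Spin rate at which centripetal acceleration w^2 * r equals g_target."""
    omega = math.sqrt(g_target / radius_m)  # angular velocity in rad/s
    return omega * 60 / (2 * math.pi)       # convert to revolutions per minute

# Illustrative radius of ~5.8 m (half of a reported 38 ft diameter):
print(f"{rpm_for_gravity(5.8):.1f} rpm")  # roughly 12 rpm
```

The larger the wheel, the slower it can turn for the same effect, which is why real artificial-gravity concepts favour much bigger structures than any film set could be.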

Ron Howard chose to shoot in real weightlessness for his 1995 film Apollo 13, a dramatisation of the near-disastrous moon mission that saw astronauts Jim Lovell, Jack Swigert and Fred Haise temporarily stranded in space after an explosion in an oxygen tank. Howard and his team – including actors Tom Hanks, Kevin Bacon and Bill Paxton – took numerous flights in the KC-135 “vomit comet”. This NASA training plane flies in a steep parabola so that passengers can experience 25 seconds of weightlessness on the way down. 

612 parabolas were required for Howard to capture the pieces of the action he needed. Apparently few people lost their lunch, though minor bumps and bruises were sometimes sustained when weightlessness ended. “It was difficult to do,” said the director at the time, “but it was an extraordinary experience.” The vomit comet footage was intercut with lower-tech angles where the actors were simply standing on see-saw-like boards which grips could gently rock up and down.

For a 2006 episode of Doctor Who, “The Impossible Planet”, the production team used Pinewood Studios’ underwater stage for a brief zero-gravity sequence. MyAnna Buring’s character Scooti has been sucked out of an airlock by a possessed colleague, and the Doctor and co watch helplessly through a window as her body floats towards a black hole. Buring was filmed floating underwater, which enabled her long hair to flow out realistically, and then composited into CGI of the black hole by The Mill.

On the whole though, wire work is the standard way of portraying zero gravity, and a particularly impressive example appeared in 2010’s Inception. Director Christopher Nolan was inspired by 2001’s weightless scenes, for which Kubrick often pointed the camera straight upwards so that the suspending wires were blocked from view by the actor’s own body.

Inception sees a fight in a dreamscape – represented by a hotel corridor – becoming weightless when the dreamers go into free-fall in the real world. The scene was shot with a 100 ft corridor set suspended on end, with the camera at the bottom shooting upwards and the cast hung on wires inside. (Miniature explosions of spacecraft traditionally used a similar technique – shooting upwards and allowing the debris to fall towards the camera in slow motion.)

2013’s Gravity filmed George Clooney and Sandra Bullock in harnesses attached to motion-control rigs. Footage of their heads was then composited onto digital body doubles which could perfectly obey the laws of zero-gravity physics.

But all of these techniques were eclipsed last year by Vyzov (“The Challenge”), a Russian feature film that actually shot aboard the International Space Station. Director Klim Shipenko and actor Yulia Peresild blasted off in a Soyuz spacecraft piloted by cosmonaut Anton Shkaplerov in autumn 2021. After a glitch in the automatic docking system which forced Shkaplerov to bring the capsule in manually, the team docked at the ISS and began 12 days of photography. Another glitch temporarily halted shooting when the station tilted unexpectedly, but the filmmakers wrapped on schedule and returned safely to Earth.

At the time of writing Vyzov has yet to be released, but according to IMDb it “follows a female surgeon who has to perform an operation on a cosmonaut too ill to return to Earth immediately”. The ISS footage is expected to form about 35 minutes of the film’s final cut.

While Vyzov is not the first film to be shot in space, it is the first to put professional cast and crew in space, rather than relying on astronauts or space tourists behind and in front of camera. It certainly won’t be the last, as NASA announced in 2020 that Tom Cruise and SpaceX would collaborate on a $200 million feature directed by Doug Liman (Edge of Tomorrow, Jumper) again to be shot partly aboard the ISS. It’s possible that Vyzov was rushed into production simply to beat Hollywood to it. While realistic weightlessness is a definite benefit of shooting in space for real, the huge amount of free publicity is probably more of a deciding factor.


The History of Virtual Production

Virtual production has been on everyone’s lips in the film industry for a couple of years now, but like all new technology it didn’t just appear overnight. Let’s trace the incremental steps that brought us to the likes of The Mandalorian and beyond.

The major component of virtual production – shooting actors against a large LED screen displaying distant or non-existent locations – has its roots in the front- and rear-projection common throughout much of the 20th century. This involved a film projector throwing pre-recorded footage onto a screen behind the talent. It was used for driving scenes in countless movies from North by Northwest to Terminator 2: Judgment Day, though by the time of the latter most filmmakers preferred blue screen.

Cary Grant films the crop duster scene from “North by Northwest”

The problem with blue and green screens is that they reflect those colours onto the talent. If the screen is blue and the inserted background is clear sky that might be acceptable, but in most cases careful lighting and post-production processing are required to eliminate the blue or green spill.

Wanting to replace these troublesome reflections with authentic ones, DP Emmanuel Lubezki, ASC, AMC conceived an “LED Box” for 2013’s Gravity. This was a 20’ cube made of LED screens displaying CG interiors of the spacecraft or Earth slowly rotating beneath the characters. “We were projecting light onto the actors’ faces that could have darkness on one side, light on another, a hot spot in the middle and different colours,” Lubezki told American Cinematographer. “It was always complex.” Gravity’s screens were of a low resolution by today’s standards, certainly not good enough to pass as real backgrounds on camera, so the full-quality CGI had to be rotoscoped in afterwards, but the lighting on the cast was authentic. 

Sandra Bullock in “Gravity’s” LED box

Around the same time Netflix’s House of Cards was doing something similar for its driving scenes, surrounding the vehicle with chromakey green but rigging LED screens just out of frame. The screens showed pre-filmed background plates of streets moving past, which created realistic reflections in the car’s bodywork and nuanced, dynamic light on the actors’ faces.

Also released in 2013 was the post-apocalyptic sci-fi Oblivion. Many scenes took place in the Sky Tower, a glass-walled outpost above the clouds. The set was surrounded by 500×42’ of white muslin onto which cloud and sky plates shot from atop a volcano were front-projected. Usually, projected images are not bright enough to reflect useful light onto the foreground, but by layering up 21 projectors DP Claudio Miranda, ASC was able to achieve a T1.3-2.0 split at ISO 800. Unlike those of Gravity’s low-res LED Box, the backgrounds were also good enough not to need replacing in post.

The set of “Oblivion” surrounded by front-projected sky backgrounds

It would take another few years for LED screens to reach that point.

By 2016 the technology was well established as a means of creating complex light sources. Deepwater Horizon, based on the true story of the Gulf of Mexico oil rig disaster, made use of a 42×24’ video wall comprising 252 LED panels. “Fire caused by burning oil is very red and has deep blacks,” DP Enrique Chediak, ASC explained to American Cinematographer, noting that propane fires generated by practical effects crews are more yellow. The solution was to light the cast with footage of genuine oil fires displayed on the LED screen.

Korean zombie movie Train to Busan used LED walls both for lighting and in-camera backgrounds zipping past the titular vehicle. Murder on the Orient Express would do the same the following year.

The hyperspace VFX displayed on a huge LED screen for “Rogue One”

Meanwhile, on the set of Rogue One, vehicles were travelling a little bit faster; a huge curved screen of WinVision Air panels (with a 9mm pixel pitch, again blocky by today’s standards) displayed a hyperspace effect around spacecraft, providing both interactive lighting and in-camera VFX so long as the screen was well out of focus. The DP was Greig Fraser, ACS, ASC, whose journey into virtual production was about to coincide with that of actor/director/producer Jon Favreau.
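As a rough guide to why pixel pitch matters, you can work out the angle a single pixel subtends at a given distance; once it drops to around one arcminute (on the order of the eye’s, and roughly a 4K camera’s, resolving power over a normal field of view) the panel starts to read as a continuous image rather than a grid of dots. A quick sketch using that 9mm pitch, with the arcminute threshold as an assumed rule of thumb rather than a hard limit:

```python
import math

def pixel_subtense_arcmin(pitch_mm, distance_m):
    """Angle one LED pixel subtends at the camera, in arcminutes."""
    return math.degrees(math.atan((pitch_mm / 1000) / distance_m)) * 60

# Rogue One's 9mm-pitch wall at increasing camera distances:
for distance in (5, 10, 30):
    a = pixel_subtense_arcmin(9.0, distance)
    print(f"at {distance} m: one pixel subtends {a:.1f} arcmin")
```

At close range a 9mm pitch is visibly blocky, which is why the screen had to stay well out of focus; a modern volume panel with a pitch of 2-3mm reaches the same threshold several times closer to camera.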

Favreau had used LED screens for interactive lighting on The Jungle Book, then for 2018’s The Lion King he employed a virtual camera system driven by the gaming engine Unity. When work began on The Mandalorian another gaming engine, Unreal, allowed a major breakthrough: real-time rendered, photo-realistic CG backgrounds. “It’s the closest thing to playing God that a DP can ever do,” Fraser remarked to British Cinematographer last year. “You can move the sun wherever you want.”

Since then we’ve seen LED volumes used prominently in productions like The Midnight Sky, The Batman and now Star Trek: Strange New Worlds, with many more using them for the odd scene here and there. Who knows what the next breakthrough might be?


The Pros and Cons of Master Shots

A master is a wide shot that covers all the action in a scene. The theory is that, should you run out of time, or your lead actor suddenly get injured, or some other calamity prevent you from shooting any coverage, at least you’ve captured the whole scene in a usable, if not ideal, form.

I have always been a fan of shooting masters. I remember once reading about a Hollywood film with a lot of puppets – it might have been Walter Murch’s 1985 Return to Oz – which fell seriously behind schedule. A producer or consultant was dispatched to the set to get things back on track, and concluded that part of the problem was a lack of masters. The director had been avoiding them because it was impossible to hide the puppeteers and rigging in wide shots, and instead was shooting scenes in smaller, tighter pieces. As a consequence, the cast and crew never saw the whole scene played out and struggled to understand how each piece fitted in, causing mistakes and necessitating time-consuming explanations.

For me, that’s the key benefit of masters: getting everyone on the same page so that the coverage goes faster.

A master shot of mine from “Forever Alone”, a student film I helped out on several years back 

You can dig yourself into holes if you don’t start with a wide. A small part of the set gets dressed and lit, a small part of the scene gets rehearsed, and then when you come to do the next part you realise it’s not going to fit together. A key prop that should have been in the background was forgotten because it wasn’t relevant to the first small piece; now you can’t put it in because you’ll break continuity. A light source that looked beautiful in that mid-shot is impossible to replicate in a later wide without seeing lamps or rigging. However much you might plan these things, inevitably in the heat of filming you get tunnel vision about the shot in front of you and everything else fades away. And it’s easy for a director, who has the whole film running on a cinema screen in their head, to forget that everyone else can’t see it as clearly.

Not starting with a wide also robs a DP of that vital, low-pressure time to light the whole set, getting all the sources in place that will be needed for the scene, so that re-lights for coverage can be quick and smooth. It also ties the editor’s hands somewhat if they haven’t got a wide shot to fall back on to get around problems.

So there are many benefits to masters. But lately I’ve been wondering if it’s dogmatic to say that they’re essential. I’ve worked with a few directors who have shot scenes in small, controlled pieces with great confidence and success.

Not shooting a master on “Harvey Greenfield is Running Late”. Photo: Mikey Kowalczyk

Last year I worked on a comedy that has a scene set at a school play, the main action taking place in the audience. Jonnie Howard, the director, was not interested in shooting a master of the hall showing the audience, the stage and the whole chunk of play that is performed during the action. All he wanted of the play was to capture certain, specific beats in mid-shots. He didn’t even know what was happening on stage the rest of the time. He knew exactly when he was going to cut to those shots, and more importantly that it would be funnier to only ever see those random moments. He also recognised that it was easier on the child actors to be given instructions for short takes, shot by shot, rather than having to learn a protracted performance.

Not shooting masters saved us valuable time on that film. It’s not the right approach for every project; it depends on the director, how well they’re able to visualise the edit, and how much flexibility they want the editor to have. It depends on the actors too; some are more able to break things down into small pieces without getting lost, while others always like to have the run-up of “going from the top”.

There is a halfway house, which is to rehearse the whole scene, but not to shoot it. This requires clear communication with the 1st AD, however, or you’ll find that certain actors who aren’t in the first shot are still tied up in make-up when you want to rehearse. Like any way of working, it’s always best to be clear about it with your key collaborators up front, so that the pros can be maximised, the cons can be minimised, and everyone does their best work most efficiently.

A rare master shot from “Heretiks”

How to Work with Natural Light

Poppy Drayton in a scene from “The Little Mermaid” where we were blessed with beautiful evening light

Natural light can be beautiful, but it is not easy for a cinematographer to work with. Continuity, dynamic range, hardness and intensity are all potential challenges.

The most obvious difficulty with natural light is that it is forever changing. It can do stunning and unexpected things, but if you don’t move quickly it’s gone. Anyone who’s ever filmed a sunset scene and had the director push for another take after the perfect light has gone knows the disappointment it can bring.

Preparation is key. Previewing the sun path using an app like Helios Pro or Sun Seeker is essential, as is working out the blocking to make the best use of the light. For The Little Mermaid I shot a sunset scene with three actors up to their waists in the Atlantic Ocean. I had to make sure, through rehearsals on dry land, that they would end up with their backs to the sun so that I would be shooting towards it.
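Apps like these are doing spherical astronomy under the hood; the core of a sun-path preview can be sketched with a standard approximation for solar declination, accurate to a degree or so, which is plenty for blocking. The latitude and date below are illustrative:

```python
import math

def solar_declination_deg(day_of_year):
    """Approximate solar declination using a common cosine formula."""
    return -23.44 * math.cos(math.radians(360 / 365 * (day_of_year + 10)))

def noon_elevation_deg(latitude_deg, day_of_year):
    """Sun's elevation above the horizon at local solar noon."""
    return 90 - abs(latitude_deg - solar_declination_deg(day_of_year))

# Cambridgeshire (~52.2 degrees N) on the summer solstice (day ~172):
print(f"{noon_elevation_deg(52.2, 172):.1f} degrees")
```

Extending this to any time of day (via the hour angle) gives azimuth too, which is essentially what the apps plot onto a map or camera view.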

Shooting the ocean scene for “The Little Mermaid”

I also had a grip next to me with a poly-board to bounce some of the sunlight back into the actors’ faces. This brings us to dynamic range, the fact that there may be too much or too little difference between the brightest and darkest areas. Too much contrast is common with exteriors under direct sun, or interiors with small windows or dark walls. Too little is often the case with overcast exteriors, or interiors with large windows or white walls.

As in my Mermaid example, shadows can be filled in using a reflector, be that the 5-in-1 collapsible kind that are widely and cheaply available, a white poly-board, a frame of Ultrabounce or even a white bedsheet. These will be much less effective indoors, where you may well need to add an artificial fill light, perhaps bounced off the ceiling.

If the light is too flat, contrast can be reduced using negative fill. Anything black can be used for this – a flag, a bedsheet, or the black side of a poly-board or 5-in-1 reflector. Typically this is placed to cut the light on the side of the talent’s face nearest camera to get the most shape in the image.

A demo of negative fill from my online course, “Cinematic Lighting”, available on Udemy

Direct sun is often too hard to be flattering, particularly in closer shots. The solution is to introduce some kind of diffusion between the actor and the sun. This could be anything from a shower curtain to a 12×12’ frame of Full Silk. 5-in-1 reflectors can be stripped down to a translucent white disc that works well for tight shots.

Indoors the trouble with natural light is that there might not be enough of it. If you like what it’s doing but just need more, try setting up a soft artificial source outside the window. A bigger production will often use 12K or 18K HMIs firing into Ultrabounce, but that requires a serious rental budget and a big generator. A smaller HMI pushing through a diffusion frame won’t be quite as soft but will be much cheaper. 

If that’s not possible either, the next best thing is a soft source like an LED panel rigged indoors above the window. By having the source indoors you will lose the natural shaping of the light that the window frame gives you, but some of this can be regained by fitting a honeycomb or egg-crate.

Hard reflector

Another option is to place a hard reflector – essentially a mirror on a C-stand – outside the window and angle it to reflect the brightest part of the sky, or even direct sun, into the room. The great news for anyone working on a tight budget is that any old mirror will do, so long as you can find a way to position and angle it conveniently.

The opposite problem is one all DPs have to tackle at some point – namely direct sun coming into a room and moving across it, spoiling continuity. Choosing a north-facing location will save a lot of trouble here; otherwise flags will need to be rigged and regularly adjusted as the sun moves, unless you can move quickly enough to shoot everything before the light has noticeably changed.

Natural light can be one of the biggest challenges for a cinematographer, but also one of the greatest gifts and highest goals to emulate.


Planning Camera Angles and Lighting

Discussing shots with director Kate Madison on the set of “Ren: The Girl with the Mark”. Photo: Michael Hudson

A thorough plan for shots and lighting can save lots of time on set, but no battle plan survives contact with the enemy. To what extent should a DP prepare?

In my experience, the extent to which camera angles are planned – and by whom – varies tremendously. Some directors will prepare a complete shot-list or storyboard and send it to the DP for feedback; others will keep it close to their chest until the time of shooting. Some don’t do one at all, either preferring to improvise on the day in collaboration with the DP, or occasionally asking the DP to plan all the shots alone.

A shot-list can be hard to interpret by itself, particularly if there’s a lot of camera movement. Overhead blocking diagrams, perhaps done in Shot Designer or a general graphics app, make things a lot clearer. Storyboards are very useful too, be they beautifully and time-consumingly drawn, or hastily scribbled thumbnails.

An Artemis shot from “Hamlet” using stand-ins

On a feature I shot last year, we were afforded the luxury of extensive rehearsals with the cast on location. I spent the time snapping photos with Artemis Pro, the viewfinder app, and ultimately output PDF storyboards of every scene; the 1st AD distributed these with the call-sheets every morning. That level of preparedness is rare unless complex stunts or VFX are involved, but it’s incredibly useful for all the departments. The art department in particular were able to see at a glance what they did and didn’t need to dress.

One of my unused storyboards from “The Little Mermaid”

Beware though: being prepared can kill spontaneity if you’re not careful. Years ago I directed a film that had a scene supposedly set at the top of a football stadium’s lighting tower; we were going to cheat it on a platform just a few feet high, and I storyboarded it accordingly. When we changed the location to a walkway in a brewery – genuinely 20ft off the ground – I stuck to the storyboards and ended up without any shots that showcased the height of the setting.

If the various departments have prepared based on your storyboards, not keeping to them can make you unpopular. So storyboards are a double-edged sword, and expectations should be carefully managed regarding how closely they will be adhered to.

The amount of planning that the DP puts into lighting will vary greatly with budget. On a micro-budget film – or a daytime soap like Doctors – you may not see the location until the day you shoot there. But on a high-end production shooting in a large soundstage you may have to agree a detailed lighting plot with the gaffer and pre-rigging crew days or weeks in advance.

Having enough crew to pre-rig upcoming scenes is one of the first things you benefit from as a DP moving up the ladder of budgets. Communicating to the gaffer what you want to achieve then becomes very important, so that when you walk onto the set with the rest of the cast and crew the broad strokes of the lighting are ready to go, and just need tweaking once the blocking has been done.

My lighting plan for a night exterior scene in “Exit Eve”

Blocking is usually the biggest barrier to preparedness. Most films have no rehearsals before the shoot begins, so you can never quite know where the actors will feel it is best to stand until they arrive on set on the day. So a lighting plan must be more about lighting the space than anything else, just trying to make sure there are sources in roughly the right places to cover any likely actor positions suggested by the script, director or layout of the set.

Whether a detailed lighting plan needs to be drawn up or not depends on the size and complexity of the set-up, but also how confident you feel that the gaffer understands exactly what you want. I often find that a few recces and conversations along with some brief written notes are enough, but the more money that’s being spent, the more crucial it is to leave no room for misunderstandings.

Again, Shot Designer is a popular solution for creating lighting plans, but some DPs use less specialised apps like Notability, and there’s nothing wrong with good old pencil and paper.

Overall, the best approach is to have a good plan, but to keep your eyes and mind open to better ideas on the day.

For more about apps that DPs can use to help them prep and shoot, see my article “Tools of the Trade” on britishcinematographer.co.uk.


How to Light Efficiently and Minimise Changes Between Angles

Exciting title, right? It’s not the glamorous side of a DP’s job, but enabling a scene to be shot quickly is a skill which definitely has its place, as long as you balance it with creative and technical quality, of course.

When a scene has been blocked and the cast have gone off to have their make-up and costuming finished, and even the director has disappeared to make plans for future scenes, the DP is left on the set to light it. Though there is always time pressure on a film, it is at a minimum during this initial lighting period (usually for the wide shot). But once the wide is in the can, the DP is expected to move quickly when tweaking lights for the coverage, as all the cast and crew are standing around waiting for you.

So a wise DP always thinks ahead to the coverage, setting up as much as possible for it concurrently with the wide, or better still sets up the wide’s lighting so that it works for the coverage too.

If we boil things right down, light looks best when it comes in from the side or the back, not the front. A common technique is to block and/or light the scene so that the main light source, be that the real sun, a window or an artificial source, is behind the cast in the wide. Let’s imagine this from the top down with the camera at 6 o’clock, the key light at 12 o’clock, and the actors in the centre.

Because of the 180º Rule, otherwise known as the Line of Action, the camera positions for the coverage are likely to all be on the bottom half of the clock face between 3 o’clock and 9 o’clock. At either of those two positions the 12 o’clock key light is now coming in from the side, so your image still has mood.
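The clock-face analogy can be sketched in a few lines of code. This is a toy illustration of my own (not from any cinematography tool): it converts clock positions to degrees and measures the angle between the key light and the camera, showing why a 12 o’clock key stays flattering from any coverage position allowed by the 180º Rule.

```python
# Toy sketch: model the clock-face analogy by converting clock positions
# to degrees and measuring the angle between key light and camera.

def clock_to_degrees(hour):
    """Clock position to degrees: 12 o'clock = 0, 3 = 90, 6 = 180, 9 = 270."""
    return (hour % 12) * 30.0

def key_to_camera_angle(key_hour, camera_hour):
    """Smallest angle between the key light and the camera, in degrees."""
    diff = abs(clock_to_degrees(key_hour) - clock_to_degrees(camera_hour)) % 360
    return min(diff, 360 - diff)

# Key at 12, camera in the wide at 6 o'clock: pure backlight.
print(key_to_camera_angle(12, 6))  # 180.0
# Coverage positions at 3 and 9 o'clock: the key becomes sidelight.
print(key_to_camera_angle(12, 3))  # 90.0
print(key_to_camera_angle(12, 9))  # 90.0
```

Anywhere between 3 and 9 o’clock, the key never comes in from flatter than 90º, so the image keeps its shape without relighting.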

This date scene in “Harvey Greenfield is Running Late” was cross-backlit; you can just see the second light in the top right of this photo.

Another common set-up is cross-backlight. Here you would have two lights, one at about 10:30 and the other at 1:30. These give a three-quarter backlight in the wide and a three-quarter key light in the singles.

Something basic to avoid is lights actually being in shot when you move to a new camera position. Early in my career I used to put all my lamps on stands because I didn’t know any better (or have any rigging kit to do anything else), but that means you’re forever moving them. Much better to rig things to the ceiling, or to position them outside the room shining in through doors and windows. 

Practical lights are really helpful too, because you can get them in shot with impunity. You can save hours of pain on set by collaborating with the art department in pre-production to make sure there are enough practicals to justify light from all the angles you might need it. Put them all on dimmers and use a fast lens or high ISO and you may well find that when you change camera position you only need to dim down the frontal ones and bring up the back ones to get the shot looking nice.

A behind-the-scenes view of some of the lights we rigged in the “Heretiks” chapel.

I once had to light a scene in a medieval chapel for a horror film called Heretiks. The master was a Steadicam shot moving 360º around the set. The gaffer and I invested the time beforehand to rig numerous 300W and 650W tungsten fresnels around the tops of all the walls, connected to dimmers. (The light was motivated by numerous candles.) With a bit of practice the gaffer and sparks were able to dim each lamp as the camera passed in front of it – to avoid camera shadows and the flat look of front light – and bring them back up afterwards, so there was always a wrapping backlight. A convenient side effect was that when we moved onto conventional coverage we could light shots in seconds by turning a few dimmers down or off and others up.

DP Benedict Spence used a similar principle on the recent BBC series This is Going to Hurt; he had 250 Astera Titan tubes built into the hospital set. While this was time-consuming and expensive upfront, it meant that shots could be lit very quickly by making a few tweaks at a lighting desk. And since the tubes looked like fluorescent strip-lights, there was never any problem with getting them in shot.

Once you start shooting a scene it’s important to keep up the pace so that the cast can stay in the zone. Spending extra time in prep or when lighting the wides will pay dividends in faster coverage, giving the director more time to get the best performances and to tell the story, which is ultimately what it’s all about.


The Sunny 16 Rule in Cinematography

If you’ve done much still photography, particularly on celluloid, you will probably have heard of the Sunny 16 Rule. It’s a useful shortcut for correctly exposing bright day exteriors without needing a light meter. Is it of any use in digital cinematography though? Yes, and I’ll explain how.

 

How the Rule Works

Sunny 16 is very simple: if the sun is out, set your aperture to f/16 and your shutter speed denominator to the same as your ISO. For example, at ISO 100 set the shutter to 1/100th of a second. At ISO 400 set the shutter to 1/400th of a second – or 1/500th of a second, if that’s the closest option the camera permits – and so on.

You can use the rule to work out other combinations from there. Say your ISO is 100 but you want the sharper, less motion-blurred look of a 1/400th shutter. That’s two stops less light, so open up the aperture two stops, from f/16 to f/8. (Check out my exposure series if this is all Dutch to you.)
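That arithmetic is easy to wrap in a couple of helper functions. This is a sketch of my own making (the function names are not from any standard library): starting from the Sunny 16 baseline, it works out how many stops a new shutter speed costs and what aperture compensates for it.

```python
import math

# Sketch: starting from Sunny 16 (f/16, shutter denominator = ISO),
# work out the aperture needed for a different shutter speed at the same ISO.

def stops_between_shutters(base_denom, new_denom):
    """Stops of light lost by moving from 1/base_denom s to 1/new_denom s."""
    return math.log2(new_denom / base_denom)

def adjusted_f_number(base_f, stops_to_open):
    """Open the aperture by the given number of stops (one stop = /sqrt(2))."""
    return base_f / (math.sqrt(2) ** stops_to_open)

# ISO 100: Sunny 16 says f/16 at 1/100 s. Moving to 1/400 s loses two
# stops, so open up two stops from f/16.
stops = stops_between_shutters(100, 400)        # 2.0
print(round(adjusted_f_number(16, stops), 1))   # 8.0
```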

The Sunny 16 Rule works because the sun outputs a constant amount of light and is a constant distance from the earth – at least constant enough to make no significant difference. The sun’s illuminance at the earth’s surface is about 10,000 foot-candles. The following formula relates illuminance (b, in foot-candles) to f-stop (f), shutter speed (s, in seconds) and ISO (i):

b = 25f² / (s × i)

Using Sunny 16 in the case of ISO 100 and a shutter speed of 1/100th of a second, this formula gives us…

b = 25 × 16² / (1/100 × 100)

… 6,400 foot-candles. Less than 10,000fc, certainly, but remember this is only a rule of thumb – and one designed for film, which isn’t hurt at all by a little over-exposure. The rule probably accounts for the fact that you may want to see into the shadows a bit too. (See my article “How Big a Light Do I Need?” for explanations of illuminance and foot-candles and more on the above formula.)

Anyway, you can see from the equation why the shutter speed denominator and ISO cancel each other out if they’re the same.
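As a quick check, here is a small sketch assuming the illuminance formula b = 25f²/(s×i), with b in foot-candles, f the f-number, s the shutter time in seconds and i the ISO. (The constant 25 is the rule-of-thumb calibration that reproduces the 6,400fc figure quoted above.)

```python
# Sketch assuming b = 25 * f^2 / (s * i): illuminance in foot-candles
# from f-number, shutter time in seconds and ISO.

def illuminance_fc(f_number, shutter_seconds, iso):
    return 25.0 * f_number ** 2 / (shutter_seconds * iso)

# Sunny 16 at ISO 100, 1/100 s, f/16:
print(illuminance_fc(16, 1 / 100, 100))  # ~6400 foot-candles

# Same ISO and shutter but overcast-ish f/8: a quarter of the light.
print(illuminance_fc(8, 1 / 100, 100))   # ~1600 foot-candles
```

You can also see numerically that whenever the shutter denominator equals the ISO, s × i = 1 and the result depends on the f-number alone.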

 

Using the Rule in Cinematography

A few weeks ago when I was on the banks of the River Cam setting up for a scene in Harvey Greenfield is Running Late, my 1st AC Hamish Nichols asked which ND filter I wanted in the matte box. It was 5:30am; the sun had barely risen and certainly wasn’t high enough yet to reach me and my light meter over the trees and buildings on the horizon. But I knew that it would be hitting us by the time we turned over, and that the weather forecast was for a completely cloudless day, indeed the hottest day of the year at that time. So I was able to predict that we’d need the 2.1 ND.

How did I work this out? From the Sunny 16 Rule as follows:

  • I was shooting with a 1/50th of a second shutter interval (a 172.8° shutter angle at 24fps), so the Rule told me that f/16 (or T16) at ISO 50 would be the right exposure.
  • I was actually at ISO 800, which is four stops faster than ISO 50. (Doubling 50 four times gives you 800.)
  • I wanted to shoot at T5.6, which is three stops faster than T16.
  • That’s a total of seven stops too much light. To find the right optical density of ND filter you multiply that by 0.3, so 0.3 x 7 = 2.1. (More on this in my ND filters post.)
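The steps above translate directly into code. This is an illustrative sketch of my own (the function names are hypothetical): it converts shutter angle to exposure time, counts the stops of extra light from the faster ISO and wider aperture, and multiplies by 0.3 to get the ND density.

```python
import math

# Sketch of the ND calculation from the bullet points above.
# Each stop of over-exposure needs 0.3 of optical density to cancel it.

def shutter_seconds(shutter_angle_deg, fps):
    """Exposure time from shutter angle: (angle / 360) / fps."""
    return (shutter_angle_deg / 360.0) / fps

def nd_density(base_iso, actual_iso, base_stop, target_stop):
    """Optical density needed to bring exposure back to the Sunny 16 baseline."""
    iso_stops = math.log2(actual_iso / base_iso)         # ISO 50 -> 800 = 4 stops
    stop_stops = 2 * math.log2(base_stop / target_stop)  # T16 -> T5.6 ~ 3 stops
    return 0.3 * (iso_stops + stop_stops)

print(shutter_seconds(172.8, 24))              # ~0.02, i.e. 1/50 s
print(round(nd_density(50, 800, 16, 5.6), 1))  # 2.1
```

Seven stops over, times 0.3 per stop, gives the 2.1 ND that went in the matte box.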

Everything on a film set sucks up time, so the more you know in advance, the more efficient you can be. Little tricks like this mean you don’t have to do a last-minute filter swing and waste five minutes that the director could have used for another take.
